Saturday, October 31, 2009

World’s Largest Telco Calls on NewVoiceMedia for Cloud Applications

Basingstoke, 29th October 2009. NewVoiceMedia today announced that it has signed an agreement with China Telecom (Europe) Ltd. (CTE) to promote the NewVoiceMedia Hosted Contact Centre Solution, ContactWorld, across Europe. CTE is the European, Middle Eastern and African operation of China Telecom, the world’s largest fixed-line operator.

NewVoiceMedia’s ContactWorld uses the increasingly popular Software-as-a-Service (SaaS) business model, allowing CTE to help companies set up or expand their European contact centres faster and more cost-effectively than traditional hardware-based approaches.

Unlike conventional contact centres, which may take months to install and configure and can involve hardware costs that are out of reach for many new operations, NewVoiceMedia’s 'hosted contact centres' use virtual telephony systems to provide the links and call-plan intelligence between callers and agents. Contact centre owners pay on demand, dramatically reducing the capital expenditure required and giving the business the flexibility to expand its call centre operations as it generates greater revenues.

Mr Ou Yan, Managing Director of CTE said: “Geared to our service philosophy of ‘Customer First, Service Foremost’, we continuously make the effort to explore next generation technologies that can add value to our customers’ business operations. Working in synergy with NewVoiceMedia, I am confident that the hosted ContactWorld platform, along with our comprehensive range of industry solutions, will provide our customers with more flexibility in managing their business operations and help them achieve better results.”

According to Drew Kraus, Research Vice President of Gartner, “When considering cost and functionality, many companies are finding SaaS-based contact centres to be the more effective solution for their needs.”

Announcing the deal, Jonathan Grant, CEO of NewVoiceMedia, expressed his delight that the world’s largest fixed-line telco saw the benefit of the ContactWorld platform for its rapidly growing European client base.

Mr Grant added: “As the face of the global economy changes rapidly, CTE’s global presence is becoming more and more important. Many of their domestic clients are moving overseas, and our suite of hosted telephony solutions is ideally suited to their needs, as it provides a quick and cost-effective way to set up and operate a call centre.”

Monday, October 26, 2009

New Xbox: Full body and gesture recognition



Microsoft's Project Natal is the first game console system without a controller. The system has a 3D camera that maps the exact position of your hands, your fingers, your feet, your head, your nose, everything, in a 3D map. This allows you to control the game with only your body, in great detail, with no controller needed. Furthermore, it recognises voices and faces and supports complex video chat.

Steven Spielberg: "This is a pivotal moment that will carry with it a wave of change, the ripples of which will reach far beyond video games."

Sunday, October 25, 2009

PPC Strategies, Twitter Interview, SEO Plugin- Weekly Wra...



Recap of activity:

- 4 Ways to Make Money with Pay Per Click Search Engines

- Twitter Marketing Expert Interview with Warren Whitlock

- 2 More Great SEO Plug-ins for Firefox

- Discussing how you can use Facebook, LinkedIn, Twitter and other social media profile pages to maximize exposure and extend your marketing reach, getting-started considerations for PPC campaign management, and referencing an expert interview with a social media user with 8,700+ LinkedIn connections.

Sunday, October 18, 2009

SAMSUNG Flexible AM OLED

Future Designer laptop - ROLLTOP, Diploma Thesis



The flexible display makes possible a new concept in notebook design, growing out of the traditional book-shaped laptop into a portable computer that unfurls and rolls up.

By virtue of OLED display technology and a multi-touch screen, a device with the utility of a laptop, the weight of a mini-notebook and a 13-inch screen easily transforms into a graphics tablet, which, with its 17-inch flat screen, can also be used as a primary monitor.

On top of everything else, all the computer's utilities, from the power supply through the carrying belt to an interactive pen, are integrated into the Rolltop. This is really an all-in-one gadget.

VMware: five biggest challenges of server virtualization

Although the benefits of virtualizing x86 servers have been pushed relentlessly for the past five years or so, much less discussed have been the challenges involved in moving to a world where resources are pooled and everything is linked.

The complexity that such a scenario generates can have a knock-on effect on issues ranging from infrastructure and licensing to skills, which means that migrating to the new environment can end up being an expensive upfront proposition.

Adrian Polley, chief executive at IT services provider Plan-Net, says, "You are often talking about a complete change in infrastructure, which is why people who started on this path before the recession may have continued, but not many have plunged in since."

A key challenge is that virtualization involves sharing resources, whether that relates to hosts, storage or networks, but changing one element of the whole can have repercussions elsewhere.

"All of this sharing means that if you give to one thing, you take away from something else, so it becomes a balancing act to understand how resources should be properly allocated," Polley says. "There are always bottlenecks and you can end up just moving them around. Because things are so interconnected, you can end up chasing your tail."

As a result, we have come up with a guide to help you work your way through the mire. Below we look at five of the biggest challenges relating to x86 server virtualization and what you can do about them.

1. Network connections

"If the network is not up to snuff, you are in trouble from the start. But the bad thing is that, if you have virtualized your servers without doing your homework, you will not know whether it is the network that is to blame for performance issues or something else," says Dan Hidlebaugh, network server manager at Hertford Regional College.

The educational establishment virtualized its x86 servers about two years ago in a bid to cut escalating utility bills, reduce its carbon footprint and improve its disaster recovery provision.

A campus-wide agreement with Microsoft meant that licensing fees were lower than those of rival vendors. So it agreed to become a European test site for the supplier's Hyper-V offering, helped by IBM, which provided the college with a free six-month trial of its BladeCenters. The organization has now consolidated its 120 physical servers down to about 55 virtual servers and expects more to follow.

But Hidlebaugh warns that the success of such projects is not just dependent on ensuring that the virtualization software works effectively.

"You have to look at what hardware you want to use, the storage area network (San), how you connect the two, how they connect to the network, how the network reaches the end-user, etc," he says. "You can have a great virtualization platform, but if clients cannot access it due to a network bottleneck, it is useless."

The college had already decided to upgrade its network as part of a planned move to new premises and undertook a thorough review. As a result, it introduced an enterprise-class Cisco router, a dual-band wireless network and 10Gbit network-to-edge switches to connect the system to users in each classroom. Twelve core fiber cables were also laid for redundancy purposes and the network was tested "mercilessly" for a month to push it to its limits.

Another performance consideration, however, related to the communications backplane of the host.

"We had to ensure that the servers' backplane could handle the same speeds as the router. If you just throw memory and processing power at it but are stuck with a 1Gbit network connection, you will end up with big performance issues," says Hidlebaugh. The BladeCenters in question have a backplane of 700Gbit/s.

2. Network storage
A further concern when going down the virtualization route relates to storage. Hypervisor suppliers generally recommend implementing network storage such as Sans for larger production deployments, particularly if organisations are keen to deploy high-availability tools such as VMware's VMotion. Direct attached storage may suffice for smaller development and test environments, however.

VMotion enables the automatic migration of workloads between different servers should one crash or need to be taken down for maintenance. But this activity requires that virtual machines be stored as disc images in the San. Each host on the network needs to be able to see each disc image to understand when and where to assign spare processing capacity should it be required.
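That visibility requirement can be sketched as a toy model in Python. This is not VMware's API; the host names and LUN identifiers below are invented for illustration. The point is simply that a workload can only fail over to hosts that can see its disc image on the San:

```python
# Toy model of the shared-storage constraint behind VMotion-style failover.
# Hypothetical hosts and San LUNs (not real identifiers):
datastores_visible = {
    "host-a": {"san-lun-1", "san-lun-2"},
    "host-b": {"san-lun-1"},
    "host-c": {"san-lun-2"},
}

def failover_targets(vm_datastore, current_host):
    """Hosts (other than the current one) that can see the VM's disc image,
    and so could take over its workload."""
    return sorted(h for h, luns in datastores_visible.items()
                  if vm_datastore in luns and h != current_host)

print(failover_targets("san-lun-1", "host-a"))  # ['host-b']
print(failover_targets("san-lun-2", "host-a"))  # ['host-c']
```

A VM on direct attached storage would appear in no other host's datastore set, which is why the hypervisor suppliers recommend network storage for high-availability deployments.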

But Sans - and personnel with the appropriate skills - are expensive to acquire, especially if organizations opt for higher-performance Fibre Channel-based systems rather than cheaper iSCSI equivalents.

Even if such a system is already in place, it may be necessary to upgrade it to ensure that performance is adequate and that all components are certified to run in a virtualized environment, which is not always the case. Checking suppliers' hardware compatibility lists is a must, as is following configuration recommendations.

3. Sizing storage capacity
Another must is to size the San adequately, not least to guard against wasting money by over-provisioning the system. Such a consideration is also important in light of the fact that some organizations find their applications run more slowly in the wake of a virtualization implementation, despite their use of server-based memory management techniques such as page sharing.

Hidlebaugh says, "Disc issues tend to be the problem." The challenge in this context is that virtual machines generate a high number of I/O requests to be processed each second, but the San's physical discs may be unable to keep up.

One way of getting around the problem is to use workload analysis and planning tools such as Novell's PlateSpin. These tools evaluate what level of capacity is likely to be required for a virtualized environment based on the profile of current physical servers in terms of memory, disc, processor and network bandwidth usage.
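The arithmetic such tools perform can be sketched roughly as follows. The figures, field names and the flat 25% headroom are invented for illustration, not PlateSpin's actual model: aggregate the peak usage profiles of the physical servers, reserve headroom for spikes and failover, and derive a host count and a San IOPS budget.

```python
from dataclasses import dataclass
from math import ceil

@dataclass
class ServerProfile:
    # Peak observed usage of one physical server (hypothetical sample figures)
    cpu_mhz: int
    memory_mb: int
    iops: int        # disc I/O requests per second
    net_mbps: int    # network bandwidth

def hosts_needed(profiles, host_cpu_mhz, host_memory_mb, headroom=0.25):
    """Estimate how many virtualization hosts the measured workloads need,
    keeping a fixed fraction of each host's capacity free as headroom."""
    usable_cpu = host_cpu_mhz * (1 - headroom)
    usable_mem = host_memory_mb * (1 - headroom)
    return max(ceil(sum(p.cpu_mhz for p in profiles) / usable_cpu),
               ceil(sum(p.memory_mb for p in profiles) / usable_mem))

def san_iops_needed(profiles, headroom=0.25):
    """Aggregate IOPS the San must sustain, with the same headroom."""
    return ceil(sum(p.iops for p in profiles) / (1 - headroom))

servers = [ServerProfile(1200, 2048, 150, 40),
           ServerProfile(2400, 4096, 300, 80),
           ServerProfile(800, 1024, 90, 20)]
print(hosts_needed(servers, host_cpu_mhz=16000, host_memory_mb=32768))  # 1
print(san_iops_needed(servers))  # 720
```

The second figure is the one that catches people out: three modest servers already demand a sustained 720 IOPS from the San's physical discs once headroom is included.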

An array that supports mixed workloads can also help. I/O-intensive applications such as databases and high-throughput software such as backup all arrive at the array together, despite their very different requirements.

But because priority is given to processing big blocks of data, smaller I/O-based sequential transactions are generally made to wait, which negatively affects their performance. A system able to handle both kinds of workloads simultaneously can help to address the issue, however.

4. Back-up challenges
Many organizations continue to back up their virtualized server environments in the same way as their physical servers, but this approach has its downsides. A key challenge relates to the fact that such activity in a physical environment is often undertaken by software agents that are installed on host operating systems and back up both applications and data to either disc or tape.

The problem with doing things this way in a virtual world is that virtual machines consist of complete logical environments that include not just the applications and data, but also the VM file system. Because traditional software does not back up the VM file system, should the virtual machine go down, it is necessary to rebuild the file system from scratch. The system must then be restored and configured and the relevant data and applications copied over to run on it.

Northern Ireland-based car dealership company Isaac Agnew was unhappy with the time-consuming nature of this process and so introduced specialist back-up tools from Veeam.

The organization initially virtualized a Dell blade server in the latter half of 2007 to try the technology out, but is now running 20 virtual machines on three VMware ESX-based machines used mainly for development and test purposes.

Tim Carter, senior systems administrator at Isaac Agnew, says, "Before Veeam, we had scripts that one of the team had written to automatically copy some files from the virtual machine onto one of the servers that was backed up periodically using CommVault. But we had to manually choose what to synchronise into the back-up folder, and if we missed something, we were in trouble."
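The fragile approach Carter describes, copying a hand-maintained list of files out of the virtual machine into a folder that the regular back-up job sweeps, might look something like this sketch. The function and its file list are hypothetical; the actual scripts at Isaac Agnew are not public.

```python
import shutil
from pathlib import Path

def sync_to_backup_folder(vm_export: Path, backup_folder: Path, files_to_sync):
    """Copy a hand-maintained list of files from a VM's exported file tree
    into the folder that the nightly back-up job (CommVault, in Carter's
    case) periodically backs up. Returns what was copied and what was not."""
    copied, missed = [], []
    for rel in files_to_sync:
        src, dst = vm_export / rel, backup_folder / rel
        if src.exists():
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)   # copy2 preserves file timestamps
            copied.append(rel)
        else:
            missed.append(rel)       # listed but absent at copy time
    return copied, missed
```

The weakness is exactly the one Carter mentions: anything left off `files_to_sync` is silently never copied at all, and the `missed` list only catches entries that were listed but absent. A snapshot-based tool that captures the whole VM file system removes that manual step.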

Backing up each virtual machine would have meant purchasing a back-up license for each machine on which they ran, which was considered too expensive.

But the snapshotting capabilities of the new tools now mean that, "We can restore the file system in a minute as opposed to hours of rebuilding the virtual machine and copying files, which often resulted in staff having to do overtime in the evenings and weekends," Carter says.

Although more storage capacity is needed to back up virtual machines in this way, the compression functionality provided by the tools mitigates this requirement nicely, he adds.

5. Application support

Although most applications will run in a virtualized environment, obtaining full support is another matter. There will be no problem with packages that are certified as "virtualization-ready", but some suppliers are unwilling to commit themselves to this approach either because they have not fully tested their software on virtualized hosts, or because their applications have already run into some kind of problem in the virtualized environment.

Other companies offer a kind of half-way house service in that users will be requested to reproduce any faults on a physical server if it is suspected that the issue is associated with the move to virtualization.

As a result, Hertford College's Hidlebaugh believes that it is necessary for organisations to go through "a whole process" to decide which applications are suitable candidates for migration and which are not.

"Suppliers of things like domain controllers told us that their applications were not proven yet and so to please wait. There are about 30 of our servers that we are not going to virtualize and about 10 of them relate to applications that have not been tested," he says.

"It is crucial to talk to your suppliers and anyone else who is supporting your applications," Hidlebaugh warns; otherwise you could end up putting yourself at risk.

He would also be wary about virtualizing I/O-intensive applications such as Hertford College's Microsoft SQL Server databases and Exchange e-mail servers without heavy amounts of testing due to San-related performance issues.

Skills
The knock-on effects of moving to a world where everything is interconnected do not end here. Another important thing to think about is skills, particularly in large enterprises, where IT staff tend to specialize in key functional areas such as storage, servers and networking.

Because all of these areas begin to overlap in the virtualized world, it is easy to end up in a scenario where support is duplicated in some areas but falls through the gaps in others. It is crucial to clearly delineate roles and decide on who is responsible for what. It may also be necessary to train personnel across the IT department in new disciplines.

Plan-Net's Polley says, "The skills issue is hard to overstate because people end up having to have a much greater breadth of knowledge. They really do need to be expert in a bunch of areas if they are going to solve problems in a virtualized world successfully."

'Endpoint' business security market to generate over $16bn by 2014, says research

The market for protecting business laptops, desktops and mobile devices with antivirus, encryption and firewall software will reach $16.4bn by 2014, according to a new security report by Juniper Research. By that date, network-based security provision to the business "endpoint" will account for close to 20% of the market, says Juniper.

Despite the ongoing move towards so-called "endpoint solutions", where IT security staff can manage the security on a company's desktops and other devices remotely through the network, demand for business IT security products will be driven by the need for encryption products, as devices become both more mobile and more likely to carry sensitive company information.

"From desktop to the mobile device, the encryption function will play an increasingly important role as both governments and businesses realize the vulnerability of data on corporate machines and the importance of data protection," says Juniper analyst Anthony Cox.

Data protection legislation covering personal information on company phones and other office hardware will lend further support to the encryption market, particularly in the US, Europe and Japan. Indeed, though Endpoint and Unified Threat Management (UTM) options have been successfully sold to the market it is still difficult to provide all the elements of a business security solution in one package, particularly if mobile devices are to be protected as well.

Other findings from the report:

* Overall business security market (excluding mobile) will be worth $15.7bn by 2014
* Encryption is set to increase at a rate of 26% to $4.3bn in 2014, largely on the back of legislation governing data protection.
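As a quick sanity check on those figures, the stated 26% growth rate and the $4.3bn 2014 target together imply a 2009 base. The base is an inference from the report's two published numbers, not a figure the report states:

```python
# 26% annual growth reaching $4.3bn in 2014 implies, working five years back,
# a 2009 encryption market of roughly $1.35bn.
growth, target_2014, years = 0.26, 4.3, 5
implied_2009 = target_2014 / (1 + growth) ** years
print(f"implied 2009 encryption market: ${implied_2009:.2f}bn")
```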


The report looks at how the mobile and desktop business IT security market will develop and in which directions. The report forecasts both the number of corporate users protected as well as revenues for six areas of security: Basic antivirus and firewall security; Encryption and advanced authorisation functions; Advanced firewalls, Intrusion detection and prevention, Virtual private networks (VPN); Security policy management, endpoint security updates and provision; Security of corporate mobile devices.

Friday, October 16, 2009

Apple Makes It Easier for Free iPhone Apps to Make Money


Apple said Thursday that it will let iPhone application developers offer their users the option to buy additional content or features within a free app on its App Store.

App developers said they received an email notice from Apple informing them that the in-app purchase feature was now available for free apps and that it would “simplify your development by creating a single version of your app that uses in App Purchase to unlock additional functionality, eliminating the need to create Lite versions of your app.” A spokeswoman for Apple confirmed the news.

The in-app purchase feature, which was first introduced in March, allows developers to offer fresh content for purchase within an app, such as new levels in a game, additional books in an e-book app, or expanded capability in productivity apps. The caveat, however, was that the feature was only available for paid apps, which meant that developers had to charge at least 99 cents.

Developers say that the latest announcement helps in two ways. First, it makes it much easier for them to make a business out of free apps. Until now, developers sold ads within their free apps or tried to convert users to a paid version with more content.

“The reality is that the vast majority of apps have been free. If you were going to monetize your app, you were always going up against free apps,” said Jamie Berger, senior vice president of an IGN Entertainment division that provides digital distribution services to developers.

The new capability could also help clean up the App Store because it would make it less necessary to offer both a free “lite” version and a paid version of the same app, a strategy that many developers used to try to make money.

“This is really big news because we’ve been having conversations with Apple as have other publishers,”
said Clive Downie, vice president of marketing at Ngmoco, which publishes iPhone games. “This enables increased choice for customers.”

Saturday, October 10, 2009

ITU TELECOM WORLD

Geneva, 5-9 October 2009.

Held only every few years, ITU TELECOM WORLD is an unrivaled event for the global telecommunication and information communication technology (ICT) sector. Forward-looking, WORLD 2009 attracts all stakeholders from across the sector and around the world for a truly global, world-class event and networking platform.

ITU is the leading United Nations agency for information and communication technologies. As the global focal point for governments and the private sector, ITU's role in helping the world communicate spans 3 core sectors: radiocommunication, standardization and development. ITU also organizes TELECOM events and was the lead organizing agency of the World Summit on the Information Society.

ITU is based in Geneva, Switzerland, and its membership includes 191 Member States and more than 700 Sector Members and Associates.

Wednesday, October 7, 2009

ID card officials back away from scandal-hit database


Government plans to store ID card biometrics data on a controversial system used by thousands of public workers might be scrapped.

The Home Office has confirmed it is reconsidering plans to use the Customer Information System to store biometric data for the ID card scheme.

The Customer Information System (CIS) - which is run by the Department for Work and Pensions (DWP) - has yet to meet the Cabinet Office's latest standards on IT security, we have learned.

As revealed in August, thirty-four council staff accessed the CIS database to snoop on the personal records of celebrities and acquaintances. Nine of the council workers were sacked.

The CIS database holds information on 85 million citizens, and is the government's main citizen database. It is available to 140,000 users from eight government departments, and to 445 local authorities.

But it is proving difficult for the Department for Work and Pensions to allow thousands of public workers and local authorities to access the Oracle-based CIS database while keeping it demonstrably secure.

The Home Office revealed plans to use the CIS system for ID cards in December 2006 in its Strategic Action Plan for the National Identity Scheme.

In the Strategic Action Plan for the National Identity Scheme, the Home Office said: "We plan to use DWP's Customer Information System (CIS) technology, subject to the successful completion of technical feasibility work," for National Identity Register biographical information.

It added: "DWP's CIS technology is already used to hold records for everyone who has a National Insurance number - i.e. nearly everyone in the UK."

The Home Office planned to separate DWP's citizen data on CIS from the biometrics store being built up on the National Identity Register.

Now the government plans to avoid using CIS for the ID card scheme, if possible. A spokesman for the Home Office said using CIS is no more than an option for the future.

He said the possibility of using CIS will not be considered until the system has full security accreditation, which is due in 2010 at the earliest.

The Home Office will store biometric information for ID cards on a database run by Thales, one of the main contractors for the ID card scheme.

Officials had planned to use CIS for the ID card scheme to save money. It would have allowed the government to avoid building an entirely new system and security architecture.

But we have learned that the security of the CIS has been so discredited that officials are keen to distance the ID card scheme from it, even if this means paying for a new system from scratch.

Sunday, October 4, 2009

MotionX-GPS Drive: low cost iPhone navigation


Californian start-up Fullpower Technologies, which has been quite successful with its GPS outdoor and sports iPhone applications, launched its own turn-by-turn navigation software on the iTunes Store a few days ago.


MotionX-GPS Drive is an off-board application based on NAVTEQ map data and deCarta's geospatial platform. It also offers Bing Local search (Microsoft) and traffic information, but no text-to-speech to pronounce street names.

Selling at $2.99 for one month or $24.99 for one year, this application is to date the lowest price point for turn-by-turn navigation on the App Store. In comparison, Gokivo (NIM) and AT&T Navigator (TeleNav) cost $9.99 per month, and on-board systems (with lifetime licences) span from $34.99 (ALK Technologies) to $99 (TomTom).
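Annualising those list prices makes the gap concrete. This is simple arithmetic on the figures quoted above; the lifetime licences are one-off payments rather than yearly costs, so they are left out of the per-year comparison:

```python
# Yearly cost of each subscription option mentioned above
motionx_yearly_plan = 24.99       # MotionX one-year subscription
motionx_monthly_x12 = 2.99 * 12   # paying MotionX month by month instead
gokivo_or_att_x12 = 9.99 * 12     # Gokivo / AT&T Navigator, per-month only

print(motionx_yearly_plan)            # 24.99
print(round(motionx_monthly_x12, 2))  # 35.88
print(round(gokivo_or_att_x12, 2))    # 119.88
```

Even paying month by month, MotionX comes in at less than a third of the established subscription offerings over a year.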

According to Wall Street Journal tech guru Walt Mossberg, who tested MotionX-GPS Drive prior to its release: “This app worked well in my tests, and is packed with features, including live traffic, a route summary, and integrated music control. It understood my D.C. test address, but it doesn’t announce street names, and its function buttons are very small and labeled with tiny type.” Not a bad review for a first navigation application.

Top questions:

Q1. "In what countries can I use MotionX-GPS Drive for navigation?"

A1. Currently, the US and Canada are supported. Watch MotionX.com for future releases with more functionality.