Thursday, December 24, 2009
Apple Tablet Debut in January?
A report in the Financial Times blog said Apple has rented space for an event next month in San Francisco.
The arrival of a new tablet hasn't been this anticipated since that Moses guy came down from the mountain top with two of them. Rumors have swirled for months that Apple is readying a new kind of tablet computer geared toward interacting with multimedia content and surfing the Web.
While Apple (NASDAQ: AAPL) has remained characteristically mum on the topic, some reports claiming insider knowledge of Apple's supply chain have said to expect the tablet's release around March of 2010. And last week a video surfaced of a purported Apple tablet prototype.
Now a post today at the Financial Times blog has stoked the fires some more, claiming Apple has rented a stage at the Yerba Buena Center for the Arts in San Francisco for Tuesday, January 26. Apple has used this location in the past and is expected to make a "major product announcement" on that date, said the blog, citing people it said were familiar with Apple's plans.
The FT quoted well-known Apple watcher and Piper Jaffray analyst Gene Munster as saying he thought an event was imminent. "We believe there is a 75 percent likelihood that Apple will have an event in January and a 50 percent chance that it will be held to launch the Apple Tablet," he wrote in a research note, quoted by FT. "If Apple announced the Tablet in January, it would likely ship later in the March quarter."
Since Apple doesn't currently sell a tablet, it could afford to pre-announce the device to help generate developer and consumer interest without directly impacting sales of its current product line, though theoretically such an announcement could affect some potential iPhone and MacBook buyers.
If current rumors are to be believed, Apple's tablet will feature a 10.1-inch multitouch display based on the same LTPS LCD technology used in the iPhone. The "iTablet" is also thought to support running windowed applications simultaneously, as opposed to the single-task, full-screen operation of the iPhone and iPod Touch.
Apple used the Yerba Buena Center venue last September when it rolled out new iPods.
Saturday, November 21, 2009
About Blue Ocean Strategy
Companies have long engaged in head-to-head competition in search of sustained, profitable growth. They have fought for competitive advantage, battled over market share, and struggled for differentiation.
Yet in today’s overcrowded industries, competing head-on results in nothing but a bloody “red ocean” of rivals fighting over a shrinking profit pool. In a book that challenges everything you thought you knew about the requirements for strategic success, W. Chan Kim and Renée Mauborgne contend that while most companies compete within such red oceans, this strategy is increasingly unlikely to create profitable growth in the future.
Based on a study of 150 strategic moves spanning more than a hundred years and thirty industries, Kim and Mauborgne argue that tomorrow’s leading companies will succeed not by battling competitors, but by creating “blue oceans” of uncontested market space ripe for growth. Such strategic moves—termed “value innovation”—create powerful leaps in value for both the firm and its buyers, rendering rivals obsolete and unleashing new demand.
Blue Ocean Strategy provides a systematic approach to making the competition irrelevant. In this frame-changing book, Kim and Mauborgne present a proven analytical framework and the tools for successfully creating and capturing blue oceans. Examining a wide range of strategic moves across a host of industries, Blue Ocean Strategy highlights the six principles that every company can use to successfully formulate and execute blue ocean strategies. The six principles show how to reconstruct market boundaries, focus on the big picture, reach beyond existing demand, get the strategic sequence right, overcome organizational hurdles, and build execution into strategy.
Upending traditional thinking about strategy, Blue Ocean Strategy charts a bold new path to winning the future.
Value Innovation
Value Innovation is the cornerstone of blue ocean strategy. Value innovation is the simultaneous pursuit of differentiation and low cost. Value innovation focuses on making the competition irrelevant by creating a leap of value for buyers and for the company, thereby opening up new and uncontested market space. Because value to buyers comes from the offering’s utility minus its price, and because value to the company is generated from the offering’s price minus its cost, value innovation is achieved only when the whole system of utility, price and cost is aligned.
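Because both sides of value are defined with simple arithmetic, the alignment test is easy to sketch in code. The Python snippet below is an illustrative sketch only, with hypothetical numbers; the function and variable names are ours, not the authors':

```python
# Illustrative sketch of the value-innovation arithmetic described above.
# Buyer value = utility - price; company value = price - cost.

def buyer_value(utility, price):
    """Value the buyer captures: utility received minus price paid."""
    return utility - price

def company_value(price, cost):
    """Value the company captures: price charged minus cost to produce."""
    return price - cost

def is_value_innovation(utility, price, cost):
    """Value innovation requires the whole utility-price-cost system to
    align: both buyer and company must capture positive value."""
    return buyer_value(utility, price) > 0 and company_value(price, cost) > 0

# A hypothetical offering: high utility at a moderate price and low cost.
print(is_value_innovation(utility=100, price=60, cost=35))  # True
```

Push the cost above the price (or the price above the utility) and the test fails, which is the book's point: neither side alone makes a value innovation.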
In the Blue Ocean Strategy methodology, the Four Actions Framework and ERRC grid assist managers in breaking the value-cost trade-off by answering the following questions:
What factors can be eliminated that the industry has taken for granted?
What factors can be reduced well below the industry’s standard?
What factors can be raised well above the industry’s standard?
What factors can be created that the industry has never offered?
Red Ocean vs Blue Ocean
Strategy Canvas
The strategy canvas is the central diagnostic and action framework for building a compelling blue ocean strategy. The horizontal axis captures the range of factors that the industry competes on and invests in, and the vertical axis captures the offering level that buyers receive across all these key competing factors.
The strategy canvas serves two purposes:
Firstly, it captures the current state of play in the known market space. This allows you to understand where the competition is currently investing and the factors that the industry competes on.
Secondly, it propels you to action by reorienting your focus from competitors to alternatives and from customers to noncustomers of the industry.
The value curve is the basic component of the strategy canvas. It is a graphic depiction of a company's relative performance across its industry's factors of competition.
As you can see on the diagram above, what makes a good value curve is focus, divergence, and a compelling tagline.
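To make the idea concrete, a value curve can be sketched as a mapping of competing factors to offering levels. The snippet below is a hypothetical illustration; the factors, scores, and the simple divergence check are all our own assumptions, not part of the formal methodology:

```python
# A value curve as a mapping of competing factors to offering levels
# on a 0-5 scale. Factors and scores are hypothetical.
industry_curve   = {"price": 4, "service": 3, "range": 4, "speed": 2, "simplicity": 1}
blue_ocean_curve = {"price": 2, "service": 1, "range": 2, "speed": 5, "simplicity": 5}

def diverges(curve_a, curve_b, threshold=2):
    """Rough divergence test: the curves differ sharply on at least
    one competing factor."""
    return any(abs(curve_a[f] - curve_b[f]) >= threshold for f in curve_a)

print(diverges(industry_curve, blue_ocean_curve))  # True
```

A curve that merely tracks the industry average on every factor would fail this check, which is the visual signature of a red-ocean strategy on the canvas.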
4 Actions Framework
To reconstruct buyer value elements in crafting a new value curve, we use the Four Actions Framework. As shown in the diagram above, to break the trade-off between differentiation and low cost and to create a new value curve, there are four key questions to challenge an industry's strategic logic and business model:
Which of the factors that the industry takes for granted should be eliminated?
Which factors should be reduced well below the industry's standard?
Which factors should be raised well above the industry's standard?
Which factors should be created that the industry has never offered?
ERRC Grid
The Eliminate-Reduce-Raise-Create (ERRC) Grid complements the Four Actions Framework. It pushes companies not only to ask all four questions in the framework but also to act on all four to create a new value curve, which is essential for unlocking a new blue ocean. By driving companies to fill in the grid with the actions of eliminating and reducing as well as raising and creating, the grid gives companies four immediate benefits:
It pushes them to pursue differentiation and low cost simultaneously, breaking the value-cost trade-off.
It immediately flags companies that focus only on raising and creating, thereby lifting their cost structure and often overengineering products and services, a common plight in many companies.
It is easily understood by managers at any level, creating a high level of engagement in its application.
Because completing the grid is a challenging task, it drives companies to robustly scrutinize every factor the industry competes on, making them discover the range of implicit assumptions they make unconsciously in competing.
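As a rough sketch, the ERRC grid can be represented as a simple data structure, together with a check that all four actions are populated. The entries below are placeholder descriptions, not factors from any real case:

```python
# The ERRC grid as a plain dictionary; entries are placeholder
# descriptions standing in for industry-specific factors.
errc_grid = {
    "eliminate": ["factors the industry takes for granted"],
    "reduce":    ["factors delivered well beyond buyer needs"],
    "raise":     ["factors held well below buyer expectations"],
    "create":    ["factors the industry has never offered"],
}

def acts_on_all_four(grid):
    """A grid only drives a new value curve when all four actions are
    populated: eliminate/reduce cut cost while raise/create lift value."""
    return all(grid.get(action) for action in ("eliminate", "reduce", "raise", "create"))

print(acts_on_all_four(errc_grid))  # True
```

A grid with an empty "eliminate" or "reduce" cell fails the check, mirroring the second benefit above: raising and creating alone only lifts the cost structure.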
Pioneer-Migrator-Settler Map
A useful exercise for a corporate management team pursuing profitable growth is to plot the company's current and planned portfolios on the pioneer-migrator-settler (PMS) map. For the purpose of the exercise, settlers are defined as me-too businesses, migrators are businesses whose offerings are better than most in the marketplace, and pioneers are the businesses that offer unprecedented value. Pioneers are your blue ocean strategic moves, the most powerful sources of profitable growth, and the only ones with a mass following of customers.
If both the current portfolio and the planned offerings consist mainly of settlers, the company has a low growth trajectory, is largely confined to red oceans, and needs to push for value innovation. Although the company might be profitable today as its settlers are still making money, it may well have fallen into the trap of competitive benchmarking, imitation, and intense price competition.
If current and planned offerings consist largely of migrators, reasonable growth can be expected. But the company is not exploiting its full potential for growth, and it risks being marginalized by a company that value-innovates. In our experience, the more an industry is populated by settlers, the greater the opportunity to value-innovate and create a blue ocean of new market space.
This exercise is especially valuable for managers who want to see beyond today's performance. Revenue, profitability, market share, and customer satisfaction are all measures of a company's current position. Contrary to what conventional strategic thinking suggests, those measures cannot point the way to the future; changes in the environment are too rapid. Today's market share is a reflection of how well a business has performed historically.
Clearly, what companies should be doing is shifting the balance of their future portfolio toward pioneers. That is the path to profitable growth. The PMS map above depicts this trajectory, showing the scatter plot of a company's portfolio of businesses, where the gravity of its current portfolio of twelve businesses, expressed as twelve dots, shifts from a preponderance of settlers to a stronger balance of migrators and pioneers.
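The PMS classification described above can be sketched as a tiny helper function. The categories follow the definitions in the text; the sample portfolio is hypothetical:

```python
# Place a business on the PMS map from a qualitative judgment of its
# value proposition, following the definitions in the text.
def pms_category(offers_unprecedented_value, better_than_most):
    if offers_unprecedented_value:
        return "pioneer"
    if better_than_most:
        return "migrator"
    return "settler"

# A hypothetical three-business portfolio:
portfolio = [
    pms_category(False, False),  # a me-too business
    pms_category(False, True),   # better than most in the marketplace
    pms_category(True, False),   # unprecedented value -> blue ocean
]
print(portfolio)  # ['settler', 'migrator', 'pioneer']
```

Counting the categories across a real portfolio is exactly the "center of gravity" reading the map is meant to give: a healthy trajectory shifts the counts from settlers toward migrators and pioneers.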
Buyer Experience Cycle / Buyer Utility Map
The buyer utility map helps to get managers thinking from the right perspective. It outlines all the levers companies can pull to deliver utility to buyers as well as the different experiences buyers can have of a product or service. This lets managers identify the full range of utility propositions that a product or service can offer. Let’s look at the map’s dimensions in detail.
The six stages of the buyer experience cycle. A buyer's experience can usually be broken down into a cycle of six distinct stages, running more or less sequentially from purchase to disposal. Each stage encompasses a wide variety of specific experiences. Purchasing, for example, includes the experience of browsing Amazon.com as well as the experience of pushing a shopping cart through Wal-Mart’s aisles.
The six utility levers. Cutting across the stages of the buyer’s experience are what we call the levers of utility – the ways in which companies unlock utility for their customers. Most of the levers are obvious. Simplicity, fun and image, and environmental friendliness need little explanation. Nor does the idea that a product could reduce a buyer’s financial or physical risks. And a product or service offers convenience simply by being easy to obtain or use. The most commonly used lever – but perhaps the least obvious – is that of customer productivity. An innovation can increase productivity by helping customers do things faster, better, or in different ways. The financial information company Bloomberg, for example, makes traders more efficient by offering online analytics that analyze and compare the raw information it delivers.
By locating a new product on one of the 36 spaces of the buyer utility map, managers can clearly see how the new idea creates a different utility proposition from existing products. In our experience, managers all too often focus on delivering more of the same stage of the buyer’s experience. That approach may be reasonable in emerging industries, where there’s plenty of room for improving a company’s utility proposition. But in many existing industries, this approach is unlikely to produce a market-shaping blue ocean strategy.
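The 36 spaces follow directly from crossing the six experience-cycle stages with the six utility levers. A minimal sketch (the stage and lever names follow the standard Blue Ocean Strategy framework; the located product is hypothetical):

```python
# The buyer utility map as a 6 x 6 grid: six experience-cycle stages
# crossed with six utility levers gives 36 possible utility spaces.
STAGES = ["purchase", "delivery", "use", "supplements", "maintenance", "disposal"]
LEVERS = ["customer productivity", "simplicity", "convenience",
          "risk reduction", "fun and image", "environmental friendliness"]

utility_map = {(stage, lever) for stage in STAGES for lever in LEVERS}
assert len(utility_map) == 36

# Locating a hypothetical new product in one space of the map:
new_idea = ("use", "customer productivity")
print(new_idea in utility_map)  # True
```

Plotting a new idea against incumbents this way makes it immediately visible when everyone is crowded into the same space, the "more of the same stage" trap described above.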
3 Tiers of Noncustomers
Typically, to grow their share of a market, companies strive to retain and expand existing customers. This often leads to finer segmentation and greater tailoring of offerings to better meet customer preferences. The more intense the competition is, the greater, on average, is the resulting customization of offerings. As companies compete to embrace customer preferences through finer segmentation, they often risk creating too-small target markets.
To maximize the size of their blue oceans, companies need to take a reverse course. Instead of concentrating on customers, they need to look to noncustomers. And instead of focusing on customer differences, they need to build on powerful commonalities in what buyers value. That allows companies to reach beyond existing demand to unlock a new mass of customers that did not exist before.
Although the universe of noncustomers typically offers big blue ocean opportunities, few companies have keen insight into who noncustomers are and how to unlock them. To convert this huge latent demand into real demand in the form of thriving new customers, companies need to deepen their understanding of the universe of noncustomers.
There are three tiers of noncustomers that can be transformed into customers. They differ in their relative distance from your market. The first tier of noncustomers is closest to your market. They sit on the edge of the market. They are buyers who minimally purchase an industry’s offering out of necessity but are mentally noncustomers of the industry. They are waiting to jump ship and leave the industry as soon as the opportunity presents itself. However, if offered a leap in value, not only would they stay, but also their frequency of purchases would multiply, unlocking enormous latent demand.
The second tier of noncustomers is people who refuse to use your industry’s offerings. These are buyers who have seen your industry’s offerings as an option to fulfill their needs but have voted against them.
The third tier of noncustomers is farthest from your market. They are noncustomers who have never thought of your market’s offerings as an option. By focusing on key commonalities across these noncustomers and existing customers, companies can understand how to pull them into their new market.
Sequence of Blue Ocean Strategy
Companies need to build their blue ocean strategy in the sequence of buyer utility, price, cost, and adoption. Have you got the strategic sequence right?
4 Hurdles to Execution
Once a company has developed a blue ocean strategy with a profitable business model, it must execute it. The challenge of execution exists, of course, for any strategy. Companies, like individuals, often have a tough time translating thought into action whether in red or blue oceans.
The challenges managers face are steep. They face four hurdles:
A cognitive hurdle: waking employees up to the need for a strategic shift. Red oceans may not be the paths to future profitable growth, but they feel comfortable to people and may even have served the organization well until now, so why rock the boat?
Limited resources. The greater the shift in strategy, the greater the resources assumed to be needed to execute it. But many companies find resources in notoriously short supply.
Motivation. How do you motivate key players to move fast and tenaciously to carry out a break from the status quo?
Politics. As one manager put it, “In our organization you get shot down before you stand up.”
Although all companies face different degrees of these hurdles, and many may face only some subset of the four, knowing how to triumph over them is key to attenuating organizational risk.
To achieve this effectively, however, companies must abandon perceived wisdom on effecting change. Conventional wisdom asserts that the greater the change, the greater the resources and time you will need to bring about results. Instead, you need to flip conventional wisdom on its head using what we call tipping point leadership. Tipping point leadership allows you to overcome these four hurdles fast and at low cost while winning employees’ backing in executing a break from the status quo.
The key questions answered by tipping point leaders are as follows: What factors or acts exercise a disproportionately positive influence on breaking the status quo? On getting the maximum bang out of each buck of resources? On motivating key players to aggressively move forward with change? And on knocking down political roadblocks that often trip up even the best strategies? By single-mindedly focusing on points of disproportionate influence, tipping point leaders can topple the four hurdles that limit execution of blue ocean strategy. They can do this fast and at low cost.
Three E Principles of Fair Process
What is fair process? Fair process builds execution into strategy by creating people's buy-in up front. When fair process is exercised in the strategy making process, people trust that a level playing field exists. This inspires them to cooperate voluntarily in executing the resulting strategic decisions.
There are three mutually reinforcing elements that define fair process: engagement, explanation, and clarity of expectation. Whether people are senior executives or shop employees, they all look to these elements. We call them the three E principles of fair process.
Conventional Wisdom vs Tipping Point Leadership
The conventional theory of organizational change rests on transforming the mass. So change efforts are focused on moving the mass, requiring steep resources and long time frames — luxuries few executives can afford. Tipping point leadership, by contrast, takes a reverse course. To change the mass it focuses on transforming the extremes: the people, acts, and activities that exercise a disproportionate influence on performance. By transforming the extremes, tipping point leaders are able to change the core fast and at low cost to execute their new strategy.
Yet in today’s overcrowded industries, competing head-on results in nothing but a bloody “red ocean” of rivals fighting over a shrinking profit pool. In a book that challenges everything you thought you knew about the requirements for strategic success, W. Chan Kim and Renée Mauborgne contend that while most companies compete within such red oceans, this strategy is increasingly unlikely to create profitable growth in the future.
Based on a study of 150 strategic moves spanning more than a hundred years and thirty industries, Kim and Mauborgne argue that tomorrow’s leading companies will succeed not by battling competitors, but by creating “blue oceans” of uncontested market space ripe for growth. Such strategic moves—termed “value innovation”—create powerful leaps in value for both the firm and its buyers, rendering rivals obsolete and unleashing new demand.
Blue Ocean Strategy provides a systematic approach to making the competition irrelevant. In this frame-changing book, Kim and Mauborgne present a proven analytical framework and the tools for successfully creating and capturing blue oceans. Examining a wide range of strategic moves across a host of industries, Blue Ocean Strategy highlights the six principles that every company can use to successfully formulate and execute blue ocean strategies. The six principles show how to reconstruct market boundaries, focus on the big picture, reach beyond existing demand, get the strategic sequence right, overcome organizational hurdles, and build execution into strategy.
Upending traditional thinking about strategy, Blue Ocean Strategy charts a bold new path to winning the future.
Value Innovation
Value Innovation is the cornerstone of blue ocean strategy. Value innovation is the simultaneous pursuit of differentiation and low cost. Value innovation focuses on making the competition irrelevant by creating a leap of value for buyers and for the company, thereby opening up new and uncontested market space. Because value to buyers comes from the offering’s utility minus its price, and because value to the company is generated from the offering’s price minus its cost, value innovation is achieved only when the whole system of utility, price and cost is aligned.
In the Blue Ocean Strategy methodology, the Four Actions Framework and ERRC grid assist managers in breaking the value-cost trade off by answering the following questions:
What factors can be eliminated that the industry has taken for granted?
What factors can be reduced well below the industry’s standard?
What factors can be raised well above the industry’s standard?
What factors can be created that the industry has never offered?
Red Ocean vs Blue Ocean
Strategy Canvas
The strategy canvas is the central diagnostic and action framework for building a compelling blue ocean strategy. The horizontal axis captures the range of factors that the industry competes on and invests in, and the vertical axis captures the offering level that buyers receive across all these key competing factors.
The strategy canvas serves two purposes:
Firstly, it captures the current state of play in the known market space. This allows you to understand where the competition is currently investing and the factors that the industry competes on.
Secondly, it propels you to action by reorienting your focus from competitors to alternatives and from customers to noncustomers of the industry.
The value curve is the basic component of the strategy canvas. It is a graphic depiction of a company's relative performance across its industry's factors of competition.
As you can see on the diagram above, what makes a good value curve is focus, divergence as well as a compelling tagline.
4 Actions Framework
To reconstruct buyer value elements in crafting a new value curve, we use the Four Actions Framework. As shown in the diagram above, to break the trade-off between differentiation and low cost and to create a new value curve, there are four key questions to challenge an industry's strategic logic and business model:
Which of the factors that the industry takes for granted should be eliminated?
Which factors should be reduced well below the industry's standard?
Which factors should be raised well above the industry's standard?
Which factors should be created that the industry has never offered?
ERRC Grid
The Eliminate-Reduce-Raise-Create Grid (ERRC) is complementary with the four actions framework. It pushes companies not only to ask all four questions in the four actions framework but also to act on all four to create a new value curve, essential for unlocking a new blue ocean. By driving companies to fill in the grid with the actions of eliminating and reducing as well as raising and creating, the grid gives companies four immediate benefits:
It pushes them to simultaneously pursue differentiation and low cost to break the value-cost trade off.
It immediately flags companies that are focused only on raising and creating and thereby lifting the cost structure and often overengineering products and services - a common plight in many companies.
It is easily understood by managers at any level, creating a high level of engagement in its application.
Because completing the grid is a challenging task, it drives companies to robustly scrutinize every factor the industry competes on, making them discover the range of implicit assumptions the make unconsciously in competing.
Pioneer-Migrator-Settler Map
A useful exercise for a corporate management team pursuing profitable growth is to plot the company's current and planned portfolios on the pioneer-migrator-settler (PMS) map. For the purpose of the exercise, settlers are defined as me-too businesses, migrators are business offerings better than most in the marketplace, and a company's pioneers are the businesses that offer unprecedented value. These are your blue ocean strategies, and are the most powerful sources of profitable growth. They are the only ones with a mass following of customers.
If both the current portfolio and the planned offerings consist mainly of settlers, the company has a low growth trajectory, is largely confined to red oceans, and needs to push for value innovation. Although the company might be profitable today as its settlers are still making money, it may well have fallen into the trap of competitive benchmarking, imitation, and intense price competition.
If current and planned offerings consist of a lot of migrators, reasonable growth can be expected. But the company is not exploiting its potential for growth, and risks being marginalized by a company that value-innovates. In our experience the more an industry is populated by settlers, the greater the opportunity to value-innovate and create a blue ocean of new market space.
This exercise is especially valuable for managers who want to see beyond today's performance. Revenue, profitability, market share, and customer satisfaction are all measures of a company's current position. Contrary to what conventional strategic thinking suggests, those measures cannot point the way to the future; changes in the environment are too rapid. Today's market share is a reflection of how well a business has performed historically.
Clearly, what companies should be doing is shifting the balance of their future portfolio toward pioneers. That is the path to profitable growth. The PMS map above depicts this trajectory, showing the scatter plot of a company's portfolio of businesses, where the gravity of its current portfolio of twelve businesses, expressed as twelve dots, shifts from a preponderance of settlers to a stronger balance of migrators and pioneers.
Buyer Experience Cycle / Buyer Utility Map
The buyer utility map helps to get managers thinking from the right perspective. It outlines all the levers companies can pull to deliver utility to buyers as well as the different experiences buyers can have of a product or service. This lets managers identify the full range of utility propositions that a product or service can offer. Let’s look at the map’s dimension in detail.
The six stages of the buyer experience cycle. A buyer's experience can usually be broken down into a cycle of six distinct stages, running more or less sequentially from purchase to disposal. Each stage encompasses a wide variety of specific experiences. Purchasing, for example, includes the experience of browsing Amazon.com as well as the experience of pushing a shopping cart through Wal-Mart’s aisles.
The six utility levers. Cutting across the stages of the buyer’s experience are what we call the levers of utility – the ways in which companies unlock utility for their customers. Most of the levers are obvious. Simplicity, fun and image, and environmental friendliness need little explanation. Nor does the idea that a product could reduce a buyer’s financial or physical risks. And a product or service offers convenience simply by being easy to obtain and or use. The most commonly used lever – but perhaps the least obvious- is that of customer productivity. An innovation can increase productivity by helping them do things faster, better, or in different ways. The financial information company Bloomberg, for example, makes traders more efficient by offering on-line analytics that analyze and compare the raw information it delivers.
By locating a new product on one of the 36 spaces of the buyer utility map, managers can clearly see how the new idea creates a different utility proposition from existing products. In our experience, managers all too often focus on delivering more of the same stage of the buyer’s experience. That approach may be reasonable in emerging industries, where there’s plenty of room for improving a company’s utility proposition. But in many existing industries, this approach is unlikely to produce a market-shaping blue ocean strategy.
3 Tiers of Noncustomers
Typically, to grow their share of a market, companies strive to retain and expand existing customers. This often leads to finer segmentation and greater tailoring of offerings to better meet customer preferences. The more intense the competition is, the greater, on average, is the resulting customization of offerings. As companies compete to embrace customer preferences through finer segmentation, they often risk creating too-small target markets.
To maximize the size of their blue oceans, companies need to take a reverse course. Instead of concentrating on customers, they need to look to noncustomers. And instead of focusing on customer differences, they need to build on powerful commonalities in what buyers value. That allows companies to reach beyond existing demand to unlock a new mass of customers that did not exist before.
Although the universe of noncustomers typically offers big blue ocean opportunities, few companies have keen insight into who noncustomers are and how to unlock them. To convert this huge latent demand into real demand in the form of thriving new customers, companies need to deepen their understanding of the universe of noncustomers.
There are three tiers of noncustomers that can be transformed into customers. They differ in their relative distance from your market. The first tier of noncustomers is closest to your market. They sit on the edge of the market. They are buyers who minimally purchase an industry’s offering out of necessity but are mentally noncustomers of the industry. They are waiting to jump ship and leave the industry as soon as the opportunity presents itself. However, if offered a leap in value, not only would they stay, but also their frequency of purchases would multiply, unlocking enormous latent demand.
The second tier of noncustomers is people who refuse to use your industry’s offerings. These are buyers who have seen your industry’s offerings as an option to fulfill their needs but have voted against them.
The third tier of noncustomers is farthest from your market. They are noncustomers who have never thought of your market’s offerings as an option. By focusing on key commonalities across these noncustomers and existing customers, companies can understand how to pull them into their new market.
Sequence of Blue Ocean Strategy
Companies need to build their Blue Ocean Strategy in the sequence of buyer utility, price, cost, and adoption. Have you got the strategic sequence right? Click on the picture on the left to find out the process.
4 Hurdles to Execution
Once a company has developed a blue ocean strategy with a profitable business model, it must execute it. The challenge of execution exists, of course, for any strategy. Companies, like individuals, often have a tough time translating thought into action whether in red or blue oceans.
The challenges managers face are steep. They face four hurdles:
A cognitive hurdle. waking employees up to the need for a strategic shift. Red oceans may not be the paths to future profitable growth, but they feel comfortable to people and may have even served an organization well until now, so why rock the boat?
Limited resources. The greater the shift in strategy, the greater it is assumed are the resources needed to execute it. But many companies find resources in notoriously short supply
Motivation. How do you motivate key players to move fast and tenaciously to carry out a break from the status quo?
Politics. As one manager put it, “In our organization you get shot down before you stand up.”
Although all companies face different degrees of these hurdles, and many may face only some subset of the four, knowing how to triumph over them is key to attenuating organizational risk.
To achieve this effectively, however, companies must abandon perceived wisdom on effecting change. Conventional wisdom asserts that the greater the change, the greater the resources and time you will need to bring about results. Instead, you need to flip conventional wisdom on its head using what we call tipping point leadership. Tipping point leadership allows you to overcome these four hurdles fast and at low cost while winning employees’ backing in executing a break from the status quo.
The key questions answered by tipping point leaders are as follows: What factors or acts exercise a disproportionately positive influence on breaking the status quo? On getting the maximum bang out of each buck of resources? On motivating key players to aggressively move forward with change? And on knocking down political roadblocks that often trip up even the best strategies? By single-mindedly focusing on points of disproportionate influence, tipping point leaders can topple the four hurdles that limit execution of blue ocean strategy. They can do this fast and at low cost.
Three E Principles of Fair Process
What is fair process? Fair process builds execution into strategy by creating people's buy-in up front. When fair process is exercised in the strategy-making process, people trust that a level playing field exists. This inspires them to cooperate voluntarily in executing the resulting strategic decisions.
There are three mutually reinforcing elements that define fair process: engagement, explanation, and clarity of expectation. Whether people are senior executives or shop employees, they all look to these elements. We call them the three E principles of fair process.
Conventional Wisdom vs Tipping Point Leadership
The conventional theory of organizational change rests on transforming the mass. So change efforts are focused on moving the mass, requiring steep resources and long time frames — luxuries few executives can afford. Tipping point leadership, by contrast, takes a reverse course. To change the mass it focuses on transforming the extremes: the people, acts, and activities that exercise a disproportionate influence on performance. By transforming the extremes, tipping point leaders are able to change the core fast and at low cost to execute their new strategy.
Thursday, November 19, 2009
Hewlett-Packard: 8 Weapons 3Com Brings to the HP-Cisco Brawl
HP’s massive $2.7 billion pick-up of 3Com will certainly turn the heat up on the company’s growing rivalry with Cisco. Channel Insider takes a look at which new capabilities added by 3Com will best help HP go toe-to-toe with Cisco.
3Com Open Network Program
HP has made it clear that the biggest differentiator it has developed in its battle against the Cisco networking juggernaut is a product base built on an open architecture. 3Com complements this strategy with its 3Com Open Network Program, through which the company has worked to develop relationships with ISVs, service providers, system integrators, consultants and customers to improve interoperability and open development in the networking environment.
Data Center Core Switch S12500
The crown jewel of the 3Com acquisition will be the added capability of core and aggregation switching, a needed complement to flesh out HP’s offering beyond the edge into the core. At the heart of this is the young H3C switch portfolio, including the flagship H3C S12500, which doubles the performance of Cisco’s Nexus 7000 while consuming half the power of its rival.
Flex Chassis Switch S5800
Similarly, HP is paying big bucks for the top-of-rack switching capabilities offered by 3Com, as evidenced by the H3C S5800, a flex-chassis switch that can be used as a modular chassis as well as a fixed-form-factor stackable switch.
MSR Family
Though ProCurve has helped HP beat up on Cisco at the edge within the SMB, HP’s still weak when it comes to enterprise edge routing. 3Com helps remedy the situation with the MSR family of routers, many of which will help HP take on Cisco’s ISR series.
Intelligent Management Center
HP’s existing Business Technology Optimization (BTO) suite will gain added firepower with the addition of Intelligent Management Center. An enterprise-class management system that scales to handle high-density infrastructure, IMC is built on a service-oriented architecture and can help enterprise customers consolidate network management within the largest of environments.
TippingPoint
While HP certainly bolstered its security practice with the 2007 acquisition of SPI Dynamics, it was still lacking strength in network security. A venerable player in the IPS/IDS field, TippingPoint gives HP the option not only to sell standalone intrusion detection, but also to build it into next-generation networking equipment, a strategy that 3Com was already spinning up with H3C.
3Com VoIP
While there’s certainly some overlap with HP’s Halo, 3Com’s VoIP portfolio offers a more mature technology base that the folks in Palo Alto can draw on if they get the integration right. Overall, this will be crucial in strengthening HP’s attack on Cisco’s UCC market share.
H3C S7506E
One of the big boons of the H3C portfolio in general is its energy efficiency. The S7506E is the greenest of the bunch, according to a recent report from independent performance-testing firm Miercom, which found that this switch has an annual operating cost 24 percent lower than the industry average.
After Playing Games, iPhone Gets Serious about Books
The iPhone is a versatile multimedia device that has already significantly impacted the business models of music, games and other Media & Entertainment industry categories.
In particular, since Apple launched the App Store in July 2008, game developers have flocked to the iPhone, creating an alternative for consumers to the leading handheld gaming platform, Nintendo DS.
In Nintendo's October 29 earnings call, the company cited iPhone competition against its DS as one of the reasons profits fell by more than half last quarter, from 133 billion yen a year prior to 64 billion yen, or $709 million.
To predict which sector of Media & Entertainment iPhone might next impact, Flurry researched the number of applications released to the App Store, by category, since its inception.
From August 2008 to August 2009, more apps were released in the Games category than any other.
This September, however, we observed another category, Books, usurping Games for the first time ever.
In October, one out of every five new apps launched on the iPhone was a book. Publishers of all kinds, from small ones like Your Mobile Apps to mega-publishers like Softbank, are porting existing IP into the App Store at record rates.
Flurry first evaluated the iPhone as an eBook reader in its July Pulse ("You Trying to Swindle my Kindle?") where it looked at consumer demand for eBooks.
In that report, we observed that during the month of August 1% of the entire U.S. population was already reading a book on the iPhone. Now, with books shipping in droves, we are seeing the supply-side explode.
The sharp rise in eBook activity on the iPhone indicates that Apple is positioned to take market share from the Amazon Kindle as it did from the Nintendo DS.
Despite the smaller form factor of the display, we predict that the iPhone will be a significant player in the book category of the Media & Entertainment space.
Further, with Apple working on a larger tablet form factor, running on the iPhone OS, we believe Jeff Bezos and team will face significant competition.
Thursday, November 12, 2009
PRESS and PRINT
Design For IT specialises in supplying high-quality printed matter.
Alongside printed matter, we also supply indoor and outdoor advertising material.
This makes Design For IT your single point of contact for all your advertising.
We set ourselves apart from the competition through our extensive service.
For example, we can design your logo and complete corporate identity and then, or straight away, design and lay out your advertising and marketing materials.
Design For IT B.V.
KvK (Chamber of Commerce) 37091026
VAT number 8091.85.477.B.01.7270
Tel. 06-50730710
General email: designforit@gmail.com
Questions? Let us know.
Is your print product not on this blog? Do you have a question about your order? Would you like more information about our products?
Our helpdesk will be happy to answer your questions as quickly as possible.
Send us an email at dfiservices@live.nl and describe your question as fully as possible.
We are happy to help, usually within 24 hours.
With any request, please provide us with the following details:
contact person
company name
billing address
shipping address
email address
remarks
Would you like help right away? That is possible.
Call one of our staff on 06-50730710.
We are available Monday through Sunday, from 09:00 to 17:00 (Saturday and Sunday 10:00-16:00).
LETTERHEAD Specifications: A4, 4/0, 90 g/m2, 1,000 copies, incl. shipping 77.50 euros
BUSINESS CARDS Specifications: 5.5 x 8.5 cm, 4/4, 300 g/m2, 250 copies, incl. shipping 45 euros. FOLDED CARDS now also available!
ENVELOPES Specifications: EA5, 15.6 x 22 cm, 90 g/m2, strip seal, window on the left, 4/0, 1,000 copies, incl. shipping 215 euros
ENVELOPES Specifications: C4, 22.9 x 32.4 cm, 120 g/m2, strip seal, window on the left, 4/0, 1,000 copies, incl. shipping 355 euros
WITH COMPLIMENTS CARDS Specifications: DL, 11 x 22 cm, 300 g/m2, 1,000 copies, incl. shipping 97 euros
SELF-ADHESIVE MEMO PADS Specifications: white or yellow paper, various sizes, 25, 50 or 100 sheets per pad, printed in PMS or 4/0, 250 pads, incl. shipping 230 euros
STICKERS Specifications: outdoor sticker, also available on label paper, A6 size, 250 copies, 4/0, incl. shipping 145 euros
PRESENTATION FOLDERS Specifications: 2 flaps, 300 g/m2 + dispersion varnish, 4/0, 1 mm filling capacity, no window, 250 copies, incl. shipping 300 euros
LEAFLETS, DL roll fold or zigzag fold Specifications: A4 folded to DL or zigzag, 4/4, 135 g/m2, 1,000 copies, incl. shipping 115 euros
Also available:
CHRISTMAS CARDS,
FLYERS in A4, A5 and A6 sizes,
MAGAZINES with 8 pages,
LEAFLETS folded from A4 to A5,
NEWSPAPERS,
TRESPA SIGNS,
FOAM BOARDS,
V-SIGNS,
CANVAS,
VINYL BANNERS,
SWING SIGNS,
BANNERS incl. STAND,
DESK PADS,
TRIANGULAR SIGNS,
POSTERS, ETC.
Monday, November 2, 2009
China claims supercomputer among world's fastest
China announced its fastest supercomputer yet Thursday in the country's latest show of its goal to become a world leader in technology.
China's National University of Defense Technology, a military academy, unveiled the machine, which would have ranked fourth in the most recent Top500 list of the world's fastest supercomputers, state media said. The supercomputer, named Milky Way, can theoretically perform more than one million billion calculations per second, the Xinhua news agency said. That figure, measured in FLOPS, or floating-point operations per second, would make it China's first petaflop-class machine.
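As a quick sanity check on the reporting, "one million billion" calculations per second works out to exactly the petaflop threshold:

```python
# "One million billion" operations per second, as reported by Xinhua,
# is 10^6 * 10^9 = 10^15 FLOPS, i.e. the definition of one petaflop.
million_billion = 1_000_000 * 1_000_000_000
one_petaflop = 10 ** 15
print(million_billion == one_petaflop)  # True
```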
The machine's data has been submitted for ranking in the Top500 list, which is next due out in November, Xinhua said, citing faculty at the university in China's inland Hunan province. The computer will be used for bio-medical computing, seismic data processing during oil exploration and for the design of "aerospace vehicles," it said.
The computer has over 11,000 microprocessors from Intel and Advanced Micro Devices and cost at least 600 million yuan (US$88 million) to build, the agency said. It will be moved to a supercomputing center in the northeastern city of Tianjin later this year, Xinhua said.
Dawning, a Chinese government-backed hardware maker, is separately designing a petaflop supercomputer it hopes to deploy next year. That system is planned to use Godson CPUs, also known by the name Loongson, a domestic chip line designed with government funding to expand China's pool of domestically owned technology.
China-made CPUs will also be added to the Milky Way supercomputer in the future to further boost its speed, Xinhua said.
Saturday, October 31, 2009
World’s Largest Telco Calls on NewVoiceMedia for Cloud Applications
Basingstoke, 29th October 2009. NewVoiceMedia today announced that it has signed an agreement with China Telecom (Europe) Ltd. (CTE) to promote the NewVoiceMedia Hosted Contact Centre Solution, ContactWorld, across Europe. CTE is the European, Middle Eastern and African operation of China Telecom, the world’s largest fixed-line operator.
NewVoiceMedia’s ContactWorld uses the increasingly popular Software-as-a-Service (SaaS) business model, allowing CTE to help companies set up or expand their European contact centres faster and more cost-effectively than with traditional hardware-based approaches.
Unlike conventional contact centres, which may take months to install and configure and which can involve hardware costs that are out of reach for many new operations, NewVoiceMedia’s 'hosted contact centres' use virtual telephony systems. This provides the links and call plan intelligence between the callers and the agents. Call Centre owners pay ‘on-demand’, dramatically reducing the capital expenditure required and allowing the flexibility for the business to expand its call centre operations as it generates greater revenues.
Mr Ou Yan, Managing Director of CTE said: “Geared to our service philosophy of ‘Customer First, Service Foremost’, we continuously make the effort to explore next generation technologies that can add value to our customers’ business operations. Working in synergy with NewVoiceMedia, I am confident that the hosted ContactWorld platform, along with our comprehensive range of industry solutions, will provide our customers with more flexibility in managing their business operations and help them achieve better results.”
According to Drew Kraus, Research Vice President of Gartner, “When considering cost and functionality, many companies are finding SaaS-based contact centres to be the more effective solution for their needs.”
Announcing the deal, Jonathan Grant, CEO of NewVoiceMedia, expressed his delight that the world’s largest fixed-line telco saw the benefit of the ContactWorld platform for its rapidly growing European client base.
Mr Grant added: “As the face of the global economy changes rapidly, CTE’s global presence is becoming more and more important. Many of their domestic clients are moving overseas, and our suite of hosted telephony solutions is ideally suited to their needs, as it provides a quick and cost-effective way to set up and operate a call centre.”
Monday, October 26, 2009
New Xbox: Full body and gesture recognition
Microsoft's Project Natal is the first game console without a controller. The system has a 3D camera that maps the exact positions of your hands, your fingers, your feet, your head, your nose, everything, in a 3D map. This allows you to control the game with only your body, in great detail, with no controller needed. Furthermore, it recognises voices and faces and supports complex video chat.
Steven Spielberg: "This is a pivotal moment that will carry with it a wave of change, the ripples of which will reach far beyond video games."
Sunday, October 25, 2009
PPC Strategies, Twitter Interview, SEO Plugin- Weekly Wra...
Recap of activity:
- 4 Ways to Make Money with Pay Per Click Search Engines
- Twitter Marketing Expert Interview with Warren Whitlock
- 2 More Great SEO Plug-ins for Firefox
- Discussing how you can use Facebook, LinkedIn, Twitter and other social media profile pages to maximize exposure and extend your marketing reach, getting-started considerations for PPC Campaign Management, and an expert interview with a social media user with 8,700+ LinkedIn connections.
Sunday, October 18, 2009
Future Designer laptop - ROLLTOP, Diploma Thesis
The flexible display enables a new concept in notebook design, evolving the traditional book-shaped laptop into a portable computer that unrolls and rolls back up.
Thanks to OLED display technology and a multitouch screen, a laptop with the weight of a mini-notebook and a 13-inch screen easily transforms into a graphics tablet, whose 17-inch flat screen can also be used as a primary monitor.
On top of everything else, all the computer's accessories, from the power supply through the carrying belt to an interactive pen, are integrated into the Rolltop. This really is an all-in-one gadget.
VMware: five biggest challenges of server virtualization
Although the benefits of virtualizing x86 servers have been pushed relentlessly for the past five years or so, much less discussed have been the challenges involved in moving to a world where resources are pooled and everything is linked.
The complexity that such a scenario generates can have a knock-on effect on issues ranging from infrastructure and licensing to skills, which means that migrating to the new environment can end up being an expensive upfront proposition.
Adrian Polley, chief executive at IT services provider Plan-Net, says, "You are often talking about a complete change in infrastructure, which is why people who started on this path before the recession may have continued, but not many have plunged in since."
A key challenge is that virtualization involves sharing resources, whether that relates to hosts, storage or networks, but changing one element of the whole can have repercussions elsewhere.
"All of this sharing means that if you give to one thing, you take away from something else, so it becomes a balancing act to understand how resources should be properly allocated," Polley says. "There are always bottlenecks and you can end up just moving them around. Because things are so interconnected, you can end up chasing your tail."
As a result, we have come up with a guide to help you work your way through the mire. Below we look at five of the biggest challenges relating to x86 server virtualization and what you can do about them.
1. Network connections
"If the network is not up to snuff, you are in trouble from the start. But the bad thing is that, if you have virtualized your servers without doing your homework, you will not know whether it is the network that is to blame for performance issues or something else," says Dan Hidlebaugh, network server manager at Hertford Regional College.
The educational establishment virtualized its x86 servers about two years ago in a bid to cut escalating utility bills, reduce its carbon footprint and improve its disaster recovery provision.
A campus-wide agreement with Microsoft meant that licensing fees were lower than those of rival vendors. So it agreed to become a European test site for the supplier's Hyper-V offering, helped by IBM, which provided the college with a free six-month trial of its BladeCenters. The organization has now consolidated its 120 physical servers down to about 55 virtual servers and expects more to follow.
But Hidlebaugh warns that the success of such projects is not just dependent on ensuring that the virtualization software works effectively.
"You have to look at what hardware you want to use, the storage area network (San), how you connect the two, how they connect to the network, how the network reaches the end-user, etc," he says. "You can have a great virtualization platform, but if clients cannot access it due to a network bottleneck, it is useless."
The college had already decided to upgrade its network as part of a planned move to new premises and undertook a thorough review. As a result, it introduced an enterprise-class Cisco router, a dual-band wireless network and 10Gbit network-to-edge switches to connect the system to users in each classroom. Twelve core fiber cables were also laid for redundancy purposes and the network was tested "mercilessly" for a month to push it to its limits.
Another performance consideration, however, related to the communications backplane of the host.
"We had to ensure that the servers' backplane could handle the same speeds as the router. If you just throw memory and processing power at it but are stuck with a 1Gbit network connection, you will end up with big performance issues," says Hidlebaugh. The BladeCenters in question have a backplane of 700Gbits.
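Hidlebaugh's point comes down to simple arithmetic: sum the peak traffic of the virtual machines you plan to consolidate and compare it with the host's uplink capacity. The figures below are illustrative assumptions, not numbers from the college's deployment:

```python
# Rough sanity check: will the consolidated VMs' peak traffic fit the
# host's network uplink? All figures are illustrative assumptions.

def uplink_utilisation(vm_peak_mbps, uplink_gbps):
    """Return aggregate peak VM traffic as a fraction of uplink capacity."""
    total_mbps = sum(vm_peak_mbps)
    return total_mbps / (uplink_gbps * 1000)

# Ten VMs peaking at 150 Mbit/s each oversubscribe a 1 Gbit/s link...
print(uplink_utilisation([150] * 10, 1))   # 1.5 -> 150% of capacity
# ...but leave ample headroom on a 10 Gbit/s link.
print(uplink_utilisation([150] * 10, 10))  # 0.15 -> 15% of capacity
```

A result above 1.0 means the uplink, not the hypervisor, will be the bottleneck, which is exactly the confusion Hidlebaugh warns about when the network homework is skipped.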
2. Network storage
A further concern when going down the virtualization route relates to storage. Hypervisor suppliers generally recommend implementing network storage such as SANs for larger production deployments, particularly if organizations are keen to deploy high-availability tools such as VMware's VMotion. Direct-attached storage may suffice for smaller development and test environments, however.
VMotion enables the automatic migration of workloads between different servers should one crash or need to be taken down for maintenance. But this activity requires that virtual machines be stored as disc images in the SAN. Each host on the network needs to be able to see each disc image to understand when and where to assign spare processing capacity should it be required.
But SANs, and personnel with the appropriate skills, are expensive to acquire, especially if organizations opt for higher-performance Fibre Channel-based systems rather than cheaper iSCSI equivalents.
Even if such a system is already in place, it may be necessary to upgrade it to ensure that performance is adequate and that all components are certified to run in a virtualized environment, which is not always the case. Checking suppliers' hardware compatibility lists is a must, as is following configuration recommendations.
3. Sizing storage capacity
Another must is to size the San adequately, not least to guard against wasting money by over-provisioning the system. Such a consideration is also important in light of the fact that some organizations find their applications run more slowly in the wake of a virtualization implementation, despite their use of server-based memory management techniques such as page sharing.
Hidlebaugh says, "Disc issues tend to be the problem." The challenge in this context is that virtual machines generate a high number of I/O requests to be processed each second, but the San's physical discs may be unable to keep up.
One way of getting around the problem is to use workload analysis and planning tools such as Novell's Platespin. These tools evaluate what level of capacity is likely to be required for a virtualized environment based on the profile of current physical servers in terms of memory, disc, processor and network bandwidth usage.
An array that supports mixed workloads can also help. I/O-intensive applications such as databases and high-throughput software, such as backup, all appear as a single big workload to the array despite their different requirements.
But because priority is given to processing big blocks of data, smaller I/O-based sequential transactions are generally made to wait, which negatively affects their performance. A system able to handle both kinds of workloads simultaneously can help to address the issue, however.
4. Back-up challenges
Many organizations continue to back up their virtualized server environments in the same way as their physical servers, but this approach has its downsides. A key challenge relates to the fact that such activity in a physical environment is often undertaken by software agents that are installed on host operating systems and back up both applications and data to either disc or tape.
The problem with doing things this way in a virtual world is that virtual machines consist of complete logical environments that include not just the applications and data, but also the VM file system. Because traditional software does not back up the VM file system, should the virtual machine go down, it is necessary to rebuild the file system from scratch. The system must then be restored and configured and the relevant data and applications copied over to run on it.
Northern Ireland-based car dealership company Isaac Agnew was unhappy with the time-consuming nature of this process and so introduced specialist back-up tools from Veeam.
The organization initially virtualized a Dell blade server in the latter half of 2007 to try the technology out, but is now running 20 virtual machines on three VMware ESX-based machines used mainly for development and test purposes.
Tim Carter, senior systems administrator at Isaac Agnew, says, "Before Veeam, we had scripts that one of the team had written to automatically copy some files from the virtual machine onto one of the servers that was backed up periodically using CommVault. But we had to manually choose what to synchronise into the back-up folder, and if we missed something, we were in trouble."
Backing up each virtual machine would have meant purchasing a back-up license for each machine on which they ran, which was considered too expensive.
But the snapshotting capabilities of the new tools now mean that, "We can restore the file system in a minute as opposed to hours of rebuilding the virtual machine and copying files, which often resulted in staffing having to do overtime in the evenings and weekends," Carter says.
Although more storage capacity is needed to back up virtual machines in this way, the compression functionality provided by the tools mitigates this requirement nicely, he adds.
5. Application support
Although most applications will run in a virtualized environment, obtaining full support is another matter. There will be no problem with packages that are certified as "virtualization-ready", but some suppliers are unwilling to commit themselves to this approach either because they have not fully tested their software on virtualized hosts, or because their applications have already run into some kind of problem in the virtualized environment.
Other companies offer a kind of half-way house service in that users will be requested to reproduce any faults on a physical server if it is suspected that the issue is associated with the move to virtualization.
As a result, Hertford College's Hidlebaugh believes that it is necessary for organisations to go through "a whole process" to decide which applications are suitable candidates for migration and which are not.
"Suppliers of things like domain controllers told us that their applications were not proven yet and so to please wait. There are about 30 of our servers that we are not going to virtualize and about 10 of them relate to applications that have not been tested," he says.
"It is crucial to talk to your suppliers and anyone else who is supporting your applications," Hidlebaugh warns, otherwise you could end up putting yourself at risk.
He would also be wary about virtualizing I/O-intensive applications such as Hertford College's Microsoft SQL Server databases and Exchange e-mail servers without heavy amounts of testing due to San-related performance issues.
Skills
The knock-on effects of moving to a world where everything is interconnected do not end here. Another important thing to think about is skills, particularly in large enterprises, where IT staff tend to specialize in key functional areas such as storage, servers and networking.
Because all of these areas begin to overlap in the virtualized world, it is easy to end up in a scenario where support is duplicated in some areas but falls through the gaps in others. It is crucial to clearly delineate roles and decide on who is responsible for what. It may also be necessary to train personnel across the IT department in new disciplines.
Plan-Net's Polley says, "The skills issue is hard to overstate because people end up having to have a much greater breadth of knowledge. They really do need to be expert in a bunch of areas if they are going to solve problems in a virtualized world successfully."
The complexity that such a scenario generates can have a knock-on effect on issues ranging from infrastructure and licensing to skills, which means that migrating to the new environment can end up being an expensive upfront proposition.
Adrian Polley, chief executive at IT services provider Plan-Net, says, "You are often talking about a complete change in infrastructure, which is why people who started on this path before the recession may have continued, but not many have plunged in since."
A key challenge is that virtualization involves sharing resources, whether that relates to hosts, storage or networks, but changing one element of the whole can have repercussions elsewhere.
"All of this sharing means that if you give to one thing, you take away from something else, so it becomes a balancing act to understand how resources should be properly allocated," Polley says. "There are always bottlenecks and you can end up just moving them around. Because things are so interconnected, you can end up chasing your tail."
As a result, we have come up with a guide to help you work your way through the mire. Below we look at five of the biggest challenges relating to x86 server virtualization and what you can do about them.
1. Network connections
"If the network is not up to snuff, you are in trouble from the start. But the bad thing is that, if you have virtualized your servers without doing your homework, you will not know whether it is the network that is to blame for performance issues or something else," says Dan Hidlebaugh, network server manager at Hertford Regional College.
The educational establishment virtualized its x86 servers about two years ago in a bid to cut escalating utility bills, reduce its carbon footprint and improve its disaster recovery provision.
A campus-wide agreement with Microsoft meant that licensing fees were lower than those of rival vendors. So it agreed to become a European test site for the supplier's Hyper-V offering, helped by IBM, which provided the college with a free six-month trial of its BladeCenters. The organization has now consolidated its 120 physical servers down to about 55 virtual servers and expects more to follow.
But Hidlebaugh warns that the success of such projects is not just dependent on ensuring that the virtualization software works effectively.
"You have to look at what hardware you want to use, the storage area network (SAN), how you connect the two, how they connect to the network, how the network reaches the end-user, etc," he says. "You can have a great virtualization platform, but if clients cannot access it due to a network bottleneck, it is useless."
The college had already decided to upgrade its network as part of a planned move to new premises and undertook a thorough review. As a result, it introduced an enterprise-class Cisco router, a dual-band wireless network and 10Gbit network-to-edge switches to connect the system to users in each classroom. Twelve core fiber cables were also laid for redundancy purposes and the network was tested "mercilessly" for a month to push it to its limits.
Another performance consideration, however, related to the communications backplane of the host.
"We had to ensure that the servers' backplane could handle the same speeds as the router. If you just throw memory and processing power at it but are stuck with a 1Gbit network connection, you will end up with big performance issues," says Hidlebaugh. The BladeCenters in question have a backplane of 700Gbits.
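Hidlebaugh's warning boils down to simple arithmetic: end-to-end throughput is capped by the slowest hop between client and host, no matter how fast the backplane is. A minimal sketch (the hop names and speeds are illustrative, not the college's actual topology):

```python
# Each hop in the path from end-user to virtualized host, in Gbit.
# Figures are invented for illustration, not Hertford College's real numbers.
path_gbits = {
    "client_access": 1,      # edge port to the end-user
    "edge_switch": 10,       # 10Gbit network-to-edge switch
    "core_router": 10,       # enterprise-class router uplink
    "server_nic": 1,         # the 1Gbit connection Hidlebaugh warns about
    "blade_backplane": 700,  # BladeCenter backplane
}

def bottleneck_gbits(path):
    """End-to-end throughput is limited by the slowest hop."""
    return min(path.values())

print(f"End-to-end cap: {bottleneck_gbits(path_gbits)}Gbit")  # → End-to-end cap: 1Gbit
```

However large the backplane figure, the 1Gbit hops set the ceiling, which is exactly why the college upgraded the network before, not after, virtualizing.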
2. Network storage
A further concern when going down the virtualization route relates to storage. Hypervisor suppliers generally recommend implementing network storage such as SANs for larger production deployments, particularly if organisations are keen to deploy high-availability tools such as VMware's VMotion. Direct attached storage may suffice for smaller development and test environments, however.
VMotion enables the automatic migration of workloads between different servers should one crash or need to be taken down for maintenance. But this activity requires that virtual machines be stored as disc images in the SAN. Each host on the network needs to be able to see each disc image to understand when and where to assign spare processing capacity should it be required.
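That visibility requirement can be checked mechanically: a virtual machine can only migrate to hosts that can see the datastore holding its disc image. A hypothetical sketch, with host and datastore names invented for illustration:

```python
# Hosts and the SAN datastores each one can see (illustrative inventory).
hosts = {
    "esx01": {"san_lun_a", "san_lun_b"},
    "esx02": {"san_lun_a"},
    "esx03": {"san_lun_b"},
}

def migration_targets(vm_datastore, source, hosts):
    """Hosts (other than the source) that can see the VM's disc image,
    and are therefore valid targets for a live migration."""
    return sorted(h for h, seen in hosts.items()
                  if h != source and vm_datastore in seen)

# A VM whose image sits on san_lun_a, running on esx01, can only move to esx02.
print(migration_targets("san_lun_a", "esx01", hosts))  # → ['esx02']
```

In practice this is why hypervisor suppliers push shared storage for production: a host with a private view of the disc images is invisible to the high-availability machinery.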
But SANs - and personnel with the appropriate skills - are expensive to acquire, especially if organizations opt for higher-performance Fibre Channel-based systems rather than cheaper iSCSI equivalents.
Even if such a system is already in place, it may be necessary to upgrade it to ensure that performance is adequate and that all components are certified to run in a virtualized environment, which is not always the case. Checking suppliers' hardware compatibility lists is a must, as is following configuration recommendations.
3. Sizing storage capacity
Another must is to size the SAN adequately, not least to guard against wasting money by over-provisioning the system. Such a consideration is also important in light of the fact that some organizations find their applications run more slowly in the wake of a virtualization implementation, despite their use of server-based memory management techniques such as page sharing.
Hidlebaugh says, "Disc issues tend to be the problem." The challenge in this context is that virtual machines generate a high number of I/O requests to be processed each second, but the SAN's physical discs may be unable to keep up.
One way of getting around the problem is to use workload analysis and planning tools such as Novell's Platespin. These tools evaluate what level of capacity is likely to be required for a virtualized environment based on the profile of current physical servers in terms of memory, disc, processor and network bandwidth usage.
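The kind of estimate such tools produce can be approximated by summing the measured I/O profiles of the physical servers being consolidated and dividing by what a single spindle can deliver. A back-of-envelope sketch with invented figures (real per-disc IOPS depend on drive type and RAID write penalties):

```python
import math

# Measured peak IOPS of the physical servers to be consolidated (illustrative).
server_iops = {"mail": 900, "db": 2400, "file": 600, "web": 300}

# Rough per-spindle capability; real figures vary widely by drive type.
IOPS_PER_DISC = 150

def discs_needed(workloads, per_disc=IOPS_PER_DISC, headroom=1.3):
    """Spindles required to absorb the combined peak, with 30% headroom."""
    total = sum(workloads.values()) * headroom
    return math.ceil(total / per_disc)

print(discs_needed(server_iops))  # → 37
```

The point of the exercise is the same one the planning tools make: aggregate IOPS, not raw capacity, usually determines how many discs the SAN needs.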
An array that supports mixed workloads can also help. I/O-intensive applications such as databases and high-throughput software, such as backup, all appear as a single big workload to the array despite their different requirements.
But because priority is given to processing big blocks of data, smaller I/O-based sequential transactions are generally made to wait, which negatively affects their performance. A system able to handle both kinds of workloads simultaneously can help to address the issue, however.
4. Back-up challenges
Many organizations continue to back up their virtualized server environments in the same way as their physical servers, but this approach has its downsides. A key challenge relates to the fact that such activity in a physical environment is often undertaken by software agents that are installed on host operating systems and back up both applications and data to either disc or tape.
The problem with doing things this way in a virtual world is that virtual machines consist of complete logical environments that include not just the applications and data, but also the VM file system. Because traditional software does not back up the VM file system, should the virtual machine go down, it is necessary to rebuild the file system from scratch. The system must then be restored and configured and the relevant data and applications copied over to run on it.
Northern Ireland-based car dealership company Isaac Agnew was unhappy with the time-consuming nature of this process and so introduced specialist back-up tools from Veeam.
The organization initially virtualized a Dell blade server in the latter half of 2007 to try the technology out, but is now running 20 virtual machines on three VMware ESX-based machines used mainly for development and test purposes.
Tim Carter, senior systems administrator at Isaac Agnew, says, "Before Veeam, we had scripts that one of the team had written to automatically copy some files from the virtual machine onto one of the servers that was backed up periodically using CommVault. But we had to manually choose what to synchronise into the back-up folder, and if we missed something, we were in trouble."
Backing up each virtual machine would have meant purchasing a back-up license for each machine on which they ran, which was considered too expensive.
But the snapshotting capabilities of the new tools now mean that, "We can restore the file system in a minute as opposed to hours of rebuilding the virtual machine and copying files, which often resulted in staffing having to do overtime in the evenings and weekends," Carter says.
Although more storage capacity is needed to back up virtual machines in this way, the compression functionality provided by the tools mitigates this requirement nicely, he adds.
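That trade-off is easy to put rough numbers on. A sketch with invented image sizes and an assumed compression ratio (real ratios depend entirely on the data being backed up):

```python
# Full image sizes of the VMs being backed up, in GB (illustrative).
vm_images_gb = [40, 80, 120, 60]

def backup_footprint_gb(images, compression_ratio=0.5, retained_copies=2):
    """Storage needed to keep N compressed full images of each VM.

    compression_ratio is compressed/original size (0.5 = halved);
    it is an assumption here, not a figure from the article.
    """
    return sum(images) * compression_ratio * retained_copies

print(backup_footprint_gb(vm_images_gb))  # → 300.0
```

With a 2:1 compression ratio, keeping two full image copies costs no more raw capacity than one uncompressed copy would, which is the mitigation Carter describes.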
5. Application support
Although most applications will run in a virtualized environment, obtaining full support is another matter. There will be no problem with packages that are certified as "virtualization-ready", but some suppliers are unwilling to commit themselves to this approach either because they have not fully tested their software on virtualized hosts, or because their applications have already run into some kind of problem in the virtualized environment.
Other companies offer a kind of half-way house service in that users will be requested to reproduce any faults on a physical server if it is suspected that the issue is associated with the move to virtualization.
As a result, Hertford College's Hidlebaugh believes that it is necessary for organisations to go through "a whole process" to decide which applications are suitable candidates for migration and which are not.
"Suppliers of things like domain controllers told us that their applications were not proven yet and so to please wait. There are about 30 of our servers that we are not going to virtualize and about 10 of them relate to applications that have not been tested," he says.
"It is crucial to talk to your suppliers and anyone else who is supporting your applications," Hidlebaugh warns, "otherwise you could end up putting yourself at risk."
He would also be wary of virtualizing I/O-intensive applications such as Hertford College's Microsoft SQL Server databases and Exchange e-mail servers without extensive testing, because of SAN-related performance issues.
Skills
The knock-on effects of moving to a world where everything is interconnected do not end here. Another important thing to think about is skills, particularly in large enterprises, where IT staff tend to specialize in key functional areas such as storage, servers and networking.
Because all of these areas begin to overlap in the virtualized world, it is easy to end up in a scenario where support is duplicated in some areas but falls through the gaps in others. It is crucial to clearly delineate roles and decide on who is responsible for what. It may also be necessary to train personnel across the IT department in new disciplines.
Plan-Net's Polley says, "The skills issue is hard to overstate because people end up having to have a much greater breadth of knowledge. They really do need to be expert in a bunch of areas if they are going to solve problems in a virtualized world successfully."
'Endpoint' business security market to generate over $16bn by 2014, says research
The market for protecting business laptops, desktops and mobile devices with antivirus, encryption and firewall software will reach $16.4bn by 2014, according to a new security report by Juniper Research. By that date, network-based security provision to the business "endpoint" will account for close to 20% of the market, says Juniper.
Despite the ongoing move towards so-called "endpoint solutions", where IT security staff can manage the security on a company's desktops and other devices remotely through the network, demand for business IT security products will be driven by the need for encryption products, as devices become both more mobile and more likely to carry sensitive company information.
"From desktop to the mobile device, the encryption function will play an increasingly important role as both governments and businesses realize the vulnerability of data on corporate machines and the importance of data protection," says Juniper analyst Anthony Cox.
Data protection legislation covering personal information on company phones and other office hardware will lend further support to the encryption market, particularly in the US, Europe and Japan. Indeed, though Endpoint and Unified Threat Management (UTM) options have been successfully sold to the market, it is still difficult to provide all the elements of a business security solution in one package, particularly if mobile devices are to be protected as well.
Other findings from the report:
* Overall business security market (excluding mobile) will be worth $15.7bn by 2014
* Encryption is set to increase at a rate of 26% to $4.3bn in 2014, largely on the back of legislation governing data protection.
The report looks at how the mobile and desktop business IT security market will develop and in which directions. It forecasts both the number of corporate users protected and revenues for six areas of security: basic antivirus and firewall security; encryption and advanced authorisation functions; advanced firewalls, intrusion detection and prevention, and virtual private networks (VPN); security policy management, endpoint security updates and provision; and security of corporate mobile devices.
Friday, October 16, 2009
Apple Makes It Easier for Free iPhone Apps to Make Money
Apple said Thursday that it will let iPhone application developers offer their users the option to buy additional content or features within a free app on its App Store.
App developers said they received an email notice from Apple informing them that the in-app purchase feature was now available for free apps and that it would “simplify your development by creating a single version of your app that uses in App Purchase to unlock additional functionality, eliminating the need to create Lite versions of your app.” A spokeswoman for Apple confirmed the news.
The in-app purchase feature, which was first introduced in March, allows developers to offer fresh content for purchase within an app such as new levels in a game, additional books in an e-book app, or expanded capability in productivity apps. The caveat, however, was that the feature was only available for paid apps, which meant that developers had to charge at least 99 cents.
Developers say that the latest announcement helps in two ways. First, it makes it much easier for them to make a business out of free apps. Until now, developers sold ads within their free apps or tried to convert users to a paid version with more content.
“The reality is that the vast majority of apps have been free. If you were going to monetize your app, you were always going up against free apps,” said Jamie Berger, senior vice president of an IGN Entertainment division that provides digital distribution services to developers.
The new capability could also help clean up the App Store because it would make it less necessary to offer both a free “lite” version and a paid version of the same app, a strategy that many developers used to try to make money.
“This is really big news because we’ve been having conversations with Apple as have other publishers,” said Clive Downie, vice president of marketing at Ngmoco, which publishes iPhone games. “This enables increased choice for customers.”
Saturday, October 10, 2009
ITU TELECOM WORLD
Geneva, 5-9 October 2009.
Held only every few years, ITU TELECOM WORLD is an unrivaled event for the global telecommunication and information communication technology (ICT) sector. Forward-looking, WORLD 2009 attracts all stakeholders from across the sector and around the world for a truly global, world-class event and networking platform.
ITU is the leading United Nations agency for information and communication technologies. As the global focal point for governments and the private sector, ITU's role in helping the world communicate spans 3 core sectors: radiocommunication, standardization and development. ITU also organizes TELECOM events and was the lead organizing agency of the World Summit on the Information Society.
ITU is based in Geneva, Switzerland, and its membership includes 191 Member States and more than 700 Sector Members and Associates.
Wednesday, October 7, 2009
ID card officials back away from scandal-hit database
Government plans to store ID card biometrics data on a controversial system used by thousands of public workers might be scrapped.
The Home Office has confirmed it is reconsidering plans to use the Customer Information System to store biometric data for the ID card scheme.
The Customer Information System (CIS) - which is run by the Department for Work and Pensions (DWP) - has yet to meet the Cabinet Office's latest standards on IT security, we have learned.
As revealed in August, thirty-four council staff accessed the CIS database to snoop on the personal records of celebrities and acquaintances. Nine of the council workers were sacked.
The CIS database holds information on 85 million citizens, and is the government's main citizen database. It is available to 140,000 users from eight government departments, and to 445 local authorities.
But it is proving difficult for the Department for Work and Pensions to allow thousands of public workers and local authority staff to access the Oracle-based CIS database while keeping it demonstrably secure.
The Home Office revealed plans to use the CIS system for ID cards in December 2006 in its Strategic Action Plan for the National Identity Scheme.
In the Strategic Action Plan for the National Identity Scheme, the Home Office said: "We plan to use DWP's Customer Information System (CIS) technology, subject to the successful completion of technical feasibility work," for National Identity Register biographical information.
It added: "DWP's CIS technology is already used to hold records for everyone who has a National Insurance number - i.e. nearly everyone in the UK."
The Home Office planned to separate DWP's citizen data on the CIS information from the biometrics store being built up on the National Identity Register.
Now the government plans to avoid using CIS for the ID card scheme, if possible. A spokesman for the Home Office said using CIS is no more than an option for the future.
He said the possibility of using CIS will not be considered until the system has full security accreditation, which is due in 2010 at the earliest.
The Home Office will store biometric information for ID cards on a database run by Thales, one of the main contractors for the ID card scheme.
Officials had planned to use CIS for the ID card scheme to save money. It would have allowed the government to avoid building an entirely new system and security architecture.
But we have learned that the security of the CIS has been so discredited that officials are keen to distance the ID card scheme from it, even if this means paying for a new system from scratch.
Sunday, October 4, 2009
MotionX-GPS Drive: low cost iPhone navigation
Fullpower Technologies, a Californian start-up that has had considerable success with its GPS outdoor and sports iPhone applications, launched its own turn-by-turn navigation software on the iTunes store a few days ago.
MotionX-GPS Drive is an off-board application based on NAVTEQ map data and deCarta's geospatial platform. It also offers Bing Local search (Microsoft) and traffic information, but no text-to-speech to pronounce street names.
Selling at $2.99 for one month or $24.99 for one year, this application is to date the lowest price point for turn-by-turn navigation on the App Store. In comparison, Gokivo (NIM) and AT&T Navigator (TeleNav) cost $9.99 per month, and on-board systems (with lifetime licenses) span from $34.99 (ALK Technologies) to $99 (TomTom).
According to Wall Street Journal tech guru Walt Mossberg, who tested MotionX-GPS Drive prior to its release: “This app worked well in my tests, and is packed with features, including live traffic, a route summary, and integrated music control. It understood my D.C. test address, but it doesn’t announce street names, and its function buttons are very small and labeled with tiny type.” Not a bad review for a first navigation application.
Top questions:
Q1. "In what countries can I use MotionX-GPS Drive for navigation?"
A1. Currently, the US and Canada are supported. Watch MotionX.com for future releases with more functionality.
Saturday, September 26, 2009
7 Ways to Still Hit Your Targets
1 Focus on your best sales opportunities
Assess all the sales opportunities in your pipeline and select those you can reasonably still close before 5 December 2009.
2 Define your unique selling points
You have a unique organisation!
3 Dig out your call list from nine months ago. How are the projects that were postponed back then coming along now?
4 Draw up a commercial action plan
Everything has to be different in the final quarter
5 Make contact with the decision-makers at the customer
Don't get stuck with people who are not allowed to make decisions
6 Put together a well-polished final-quarter offer. Bring in professionals for this
7 Hide your own goal of 'I have to sell'
Solve the customer's problems, not your own
Thursday, September 17, 2009
Sunday, September 13, 2009
Mobile Cloud Computing Subscribers to Total Nearly One Billion by 2014
Experts say that the number of mobile cloud computing subscribers worldwide will grow rapidly over the next five years, rising from 42.8 million in 2008 (approximately 1.1% of all mobile subscribers) to just over 998 million in 2014 (nearly 19%). Mobile cloud applications move computing power and data storage away from mobile phones and into the cloud, bringing apps and mobile computing not just to smartphone users but to a much broader range of mobile subscribers.
According to senior analysts: “From 2008 through 2010, subscriber numbers will be driven by location-enabled services, particularly navigation and map applications. A total of 60% of the mobile Cloud application subscribers worldwide will use an application enabled by location during these years.”
Some quite innovative applications are already commercially available. Lock manufacturer Schlage, for example, has launched LiNK – a keyless lock system for the home that enables subscribers to remotely control not only the door lock but also heating/cooling, security cameras and light monitors, all via PC or mobile device.
Business productivity applications will soon dominate the mix of mobile cloud applications, particularly collaborative document sharing, scheduling, and sales force management apps.
Experts expect some or all of the major PaaS platforms – Google, Amazon AWS, and Force.com – to market their mobile capabilities aggressively starting in 2010.
Experts conclude by reiterating their findings: “By 2014, mobile cloud computing will become the leading mobile application development and deployment strategy, displacing today’s native and downloadable mobile applications.”
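Those headline figures imply a steep growth curve. As a rough sketch (my own arithmetic, not from the report), the implied compound annual growth rate over the six years from 2008 to 2014 works out to roughly 69% per year:

```python
# Illustrative back-of-the-envelope arithmetic (not from the report):
# implied compound annual growth rate (CAGR) of mobile cloud subscribers,
# growing from 42.8 million in 2008 to 998 million in 2014.
start_subs = 42.8e6   # subscribers in 2008
end_subs = 998e6      # forecast subscribers in 2014
years = 2014 - 2008   # six years of growth

cagr = (end_subs / start_subs) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 69% per year
```

A growth rate that high is typical of a category starting from a tiny base; it need not persist once the market matures.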
Netbooks reshape the PC industry
- Telcos across EMEA and APAC rush to sell 3G netbooks
* 13.5 million netbooks were sold worldwide in H1 2009
* More than 50 telcos have begun selling netbooks
* HP has the most telco deals overall, but Samsung has risen quickly with the NC10
* Many established PC vendors have moved too slowly, including Sony, Toshiba, Fujitsu and Lenovo
* Netbooks are three times as likely as notebooks to be used in public places
* €100-€199 is the sweet spot for subsidy-driven netbooks
H1 2009 research highlights
The PC industry is undergoing a more dramatic transformation than seen at any time in the last 15 years. The netbook category was invented as recently as 18 months ago by the likes of Asus and Acer and is the only PC segment enjoying growth this year. The impact of netbooks has been profound. It has forced Microsoft to fend off a threat from Linux by reducing its operating system prices and to continue promoting its aging XP brand. Netbooks have dramatically lowered industry price points, attracting new categories of consumer buyers. Furthermore, hard-pressed PC vendors have been forced to cut their operating costs to have any chance of turning a profit. The biggest change of all has been the success the telcos have had in selling subsidised 3G netbooks, emulating the mobile phone business model. The market shares of PC vendors are changing rapidly on the back of their willingness to commit to the netbook category and their agility in chasing these new, substantial telco deals.
A research analyst said, “Our latest research reveals that, in August, across Europe, the four PC vendors with the most telco deals were Samsung, Asus, HP and Acer. The real surprise has been how quickly the Korean vendors have moved to leverage their mobile phone businesses, selling netbooks to telcos – LG’s netbooks have become prominent in the major countries too. Samsung has achieved great reviews for its NC10 netbook, primarily because of its keyboard and extended battery life. Suddenly, Samsung is a force to be reckoned with in the PC industry – it already has deals with more than half of the telcos currently selling netbooks. Meanwhile, traditional notebook category leaders – including Lenovo, Fujitsu, Sony and Toshiba – have been slow to recognise how quickly the market is changing around them and as a group they have signed fewer than 10 operators.”
As well as being an important sales channel, the operators are playing a pivotal marketing role for the netbook. The telcos have massive retail and marketing coverage, so suddenly netbooks are being promoted in newspapers, on billboards and in storefronts with a prominence never before given to PCs. Vendors that are not present in the telco channel are missing out on valuable promotional opportunities. Netbooks and smart phones are finally justifying the telcos’ massive infrastructure investments in mobile broadband. They are seeing data revenues rise quickly to offset falling prices for their voice services, and the business case for investments in LTE is starting to look more promising.
Research suggests that many netbooks are being sold as additional devices, rather than as replacements for notebook PCs. A survey of over 3,000 European consumers during August 2009 revealed that netbooks were three times as likely as notebooks to be used in cafés, public parks or on trains. More than 45% of netbook owners said that they took the device on vacation with them.
“The telco channel took around six months longer to develop in APAC than in EMEA, but activity has accelerated, especially in North Asia. We observed more than double the number of netbook deals in telcos in August as we did in June. The local vendors are moving fast in their home countries, so Asus and Acer lead in Taiwan, Samsung and LG have the deals in Korea, while Lenovo, Haier and Tsinghua Tongfang are active in China. Sony, Toshiba and Sharp have all arranged deals in Japan. HP’s superior coverage gives it the most deals overall across the APAC region. We expect to see a rush of new deals across South East Asia and Oceania toward the end of this year.”
In August, in both EMEA and APAC, the preferred range for subsidised netbook prices was €100-€199 ($145-$290). Monthly contracts are more common in Europe, whereas in APAC pre-pay is preferred, both through embedded 3G and dongle options. Consequently, the subsidies on offer from the telcos are around €60 ($97) higher in Europe than in Asia.
Apple has resisted the temptation to follow other PC vendors into the telcos, despite the fact that its phenomenally successful iPhone has given it these relationships. Its selling proposition and price points for the Mac fit better with its own Apple retail stores, its stores-within-stores, and its Premium Resellers and other partners. Nokia, the smart phone leader, has, on the other hand, moved quickly to launch its Windows-based Booklet. It has the best telco coverage of any vendor, but it will be a surprise if it can succeed with price points that are substantially higher than the competition.
Microsoft’s launch of Windows 7 next month is likely to provide a further boost to the PC market, the consumer side of which has held up surprisingly well during the summer months. Expect the distinctions between smart phones, netbooks and notebooks to become increasingly unclear over the next year as the screen sizes and performance of netbooks increase while new operating systems and processors are launched. Mobile devices, in all their different forms, have been the bright points within the technology industry in this difficult year.
Thursday, September 10, 2009
Jobs Takes Stage at Apple Event
Apple's chief executive officer gratefully accepts applause as he returns to the stage at Wednesday's company event after a lengthy absence due to a liver transplant. (Sept. 9)
Mr. Jobs, taking the stage at an event in San Francisco, unveiled new offerings that included an iPod Nano with a video camera. Apple also dropped prices across its iPod lineup as the company tries to revive slowing sales.
Mr. Jobs, who had not been seen publicly since an October event, was dressed in his usual black turtleneck and jeans. The 54-year-old appeared thin and spoke with a scratchy voice, but showed energy and enthusiasm.
"I'm very happy to be here with you all," said Mr. Jobs as he received a standing ovation. He explained that he had received the liver of a young adult who died in a car accident. "I wouldn't be here without such generosity," he said, urging others to become organ donors, too.
Apple's changes to iTunes, including the social-networking features, are the biggest in years, and the iPod nano's video camera is a shot across Cisco's bow, Walt Mossberg of The Wall Street Journal tells Stacey Delo.
Apple's CEO and co-founder returned to his post in late June, following a nearly six-month medical leave. Mr. Jobs, who has battled pancreatic cancer, worried investors last year by exhibiting noticeable weight loss. He bowed out of his usual keynote at the Macworld trade show in January and went on leave.
"He looked thin but much better than he had a year ago. Part of the reason was to show the crowd he's alive and kicking," said Charlie Wolf, an analyst at Needham & Co.
Apple showed off new iTunes software and iPods with lower starting prices amid slowing sales and increasing competition from companies like Microsoft Corp., which recently announced a new version of its Zune HD music player.
The iPod is still the dominant digital music player, with nearly 74% market share, according to Apple. But it has been eclipsed by the fast-selling iPhone. In the quarter ended June 27, iPod revenue fell 11% from a year earlier to $1.49 billion.
Apple lowered the starting price of its iPod Touch device, which is essentially an iPhone without cellular phone capability, to $199 from $229. The new iPod Nano, starting at $149, comes with an FM receiver and pedometer in addition to the built-in video camera.
"They're just trying to segment the product line, and they're trying to get people to buy multiple iPods," said Gene Munster, an industry analyst with Piper Jaffray & Co. He noted that Apple didn't add a camera in the iPod Touch as had been widely expected.
The company stressed the success of games on the iPod Touch and the iPhone, compared with devices like Sony Corp.'s PlayStation Portable and Nintendo Co.'s DS.
"When these things came up they seemed so cool...but they don't really stack up anymore," said Apple marketing chief Philip Schiller.
Apple also unveiled a new version of its iTunes software and online store. Among the new features: greater ability to share music and other digital content between multiple computers in a single home and a feature called iTunes LP, which brings additional content like lyrics, videos and artwork to albums purchased on the site.
Tuesday, September 8, 2009
Home fibre plans survive downturn
The benefits of fibre to the home go beyond speed
More than two million people in Europe now have fibre broadband direct to their home, suggests a survey.
The latest figures on superfast broadband delivered by fibre to the home (FTTH) show 18% growth over the last survey, compiled in late 2008.
The continued growth suggests that the global economic downturn has not hit plans to build a fibre infrastructure.
Sweden tops the list of nations rolling out the technology, with 10.9% of its broadband customers using fibre.
Karel Helsen, president of Europe's Fibre-To-The-Home Council, said the growth matched predictions that were revised when the credit crunch started to make itself felt.
TOP FIBRE NATIONS
1) Sweden - 10.9%
2) Norway - 10.2%
3) Slovenia - 8.9%
4) Andorra - 6.6%
5) Denmark - 5.7%
6) Iceland - 5.6%
7) Lithuania - 3.3%
8) Netherlands - 2.5%
9) Slovakia - 2.5%
10) Finland - 2.4%
"The numbers in 2009 are in line with the latest forecasts," said Mr Helsen.
By 2012, the FTTH Council expects that 13 million people across 35 European nations will have their broadband delivered by fibre. Such services would start at speeds of 100 megabits per second (Mbps), said Mr Helsen.
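To put a 100 Mbps starting speed in perspective, here is a back-of-the-envelope sketch (my own illustration, not from the FTTH Council's figures) of how long a file transfer takes at that rate, ignoring protocol overhead:

```python
# Rough transfer-time estimate for a fibre link. Note the unit trap:
# link speeds are quoted in megabits per second, file sizes in (giga)bytes,
# and 1 byte = 8 bits.
def transfer_seconds(file_size_gb: float, link_mbps: float = 100.0) -> float:
    bits = file_size_gb * 8e9          # 1 GB (decimal) = 8e9 bits
    return bits / (link_mbps * 1e6)    # link rate in bits per second

print(transfer_seconds(1.0))    # a 1 GB file at 100 Mbps: 80 seconds
print(transfer_seconds(1.0, 8)) # the same file on 8 Mbps ADSL: 1000 seconds
```

Real-world throughput would be somewhat lower once TCP/IP overhead and server limits are factored in, but the order of magnitude holds.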
Around Europe more than 233 projects were underway to lay the fibres that would connect homes or buildings to the net, said Mr Helsen. Many of those, he said, were being operated by local governments or smaller net firms.
Local governments were interested in FTTH because of the economic and social benefits it brought in its wake, said Mr Helsen.
The low latency or delay inherent in high-speed fibre networks made possible novel uses of broadband, he said.
"No delay is very important," he said, "specifically if you talk about applications that are time dependent such as personal communications, conference calls or video calls where delays cause a lot of interference."
While early FTTH services were concentrated in cities, said Mr Helsen, many more were reaching out to rural areas for e-health and e-learning projects.
Separate studies show that an FTTH infrastructure can have a direct impact on local economic output, said Mr Helsen.
The UK, France and Germany have yet to break into the list of top ten FTTH nations.