About Me
- Roy Illsley
- A Senior Research Analyst for a leading firm, with a focus on infrastructure management and virtualisation
Tuesday, 4 November 2008
Moving from Old to New
I believe that in the next few years many organisations will reach a tipping point where they must make the leap and implement new technology, which could create problems for the older technologies they still run. I therefore think a time will come when organisations consider complete replacement using an Infrastructure as a Service (IaaS) approach, and that will be the beginning of the cloud revolution.
However, do not worry: that time is still a few years away, and in fact may not arrive for 10 years. But, like the Internet, it is coming, so do not bury your head in the sand; be prepared and start planning now for a brave new IT world.
Monday, 3 November 2008
Clouds
Over recent weeks a number of high profile announcements about cloud initiatives have circulated, and at first sight this may suggest that cloud computing has broken through from concept to reality in the enterprise, but is this just more hype or are we at the dawn of a new era in computing?
Firstly, the concept of cloud is not universally defined, so all the rhetoric needs to be carefully evaluated; I describe ‘cloud computing’ as the ability to deliver IT as a collection of services to a wide range of customers over the Web. I further refine this definition as either internal – IT within an organisation’s firewall making its resources available as a cloud to its customers – or external – where a service provider supplies IT capability to customers via the Internet, either as a top-up to existing IT resources or as a complete solution, thereby making the server-less organisation a reality.
One of these announcements was that IBM is investing US$300M in 13 new data centres world-wide aimed at providing Disaster Recovery (DR) capabilities. This new initiative was described as a cloud computing solution for DR; it provides backups of data on servers that can then be quickly accessed to rapidly restore lost files. This solution can be seen as IBM leveraging its acquisition in 2007 of Arsenal Digital Solutions – a manufacturer of rack-mounted appliances dedicated to business continuity.
However, the IBM announcement uses the term cloud but does not really deliver a cloud solution; it merely offers a single cloud-based service, which I believe represents the current state of the market: that is to say, single cloud solutions aimed at particular niche deployments. In July HP, Intel, and Yahoo announced that they are working together to deploy a global test-bed of six data centres for the open development and testing of solutions to the challenges cloud computing will present. This announcement is on a larger scale than the Google and IBM announcement of October 2007, where two data centres dedicated to cloud research were set up.
I consider the concept of cloud computing to represent the future of how IT will be delivered to its customers, but I believe that many issues remain with cloud computing, and I applaud the efforts of HP, Intel, Yahoo, IBM, Google, and Microsoft in providing the platforms for developers and researchers to work on the challenges. I consider one of the most fundamental challenges to be how the services and delivery will be managed and, more importantly, charged for. This is just one example of the sort of practical questions that come to mind when you start to consider how the cloud concept can be used.
I expect many more announcements of cloud solutions like the IBM DR one will be made over the coming years. This, I believe, will create confusion in the market similar to that when virtualisation first appeared, but as the research turns to solutions the scope of these announcements will increase from single solutions to more enterprise-ready solutions; however, the marketing hype may have already created a high level of scepticism among end-user organisations, which will need to be convinced that the cloud has arrived and is fit for purpose in commercial deployments.
Monday, 6 October 2008
V IS THE WORD
For many years now the analysts have been calling VMware an OS vendor, but it refused to accept the label, arguing that it considered itself a virtualisation vendor and not a direct competitor of Microsoft’s, but rather a complementary technology. However, the announcement of the VDC-OS demonstrates a move towards a ‘vmainframe’ computer principle. Effectively, what VDC-OS will enable is for large resource pools to be created from a collection of commodity-based hardware, which can consist of storage, servers, or network devices. These large computing resource pools, it is argued, will provide the scalability needed by organisations to execute a collection of applications that provide a business service as a single Virtual Machine (VM).
The VDC-OS is a framework with three main components. The interface to the hardware is called vInfrastructure Services, and includes vCompute, vStorage, and vNetwork, which are all capabilities designed to abstract the resources so they can be pooled. The second component is Application vServices, which addresses the changing nature of an application’s relationship to an OS. The final component is the management layer, for which VMware has renamed Virtual Centre (VC) to vCentre. The VDC-OS represents the evolution of VMware’s Virtual Infrastructure (VI) solution, and it is anticipated that by early 2009 many of the capabilities required to make VDC-OS a reality will be available.
The vCloud initiative consists of three different approaches: the first is a program for service providers that will enable them to construct cloud solutions to offer to their customers; the second will eventually be a product that enterprises can purchase to construct their own internal clouds; and the third is a set of APIs that will allow the on-demand allocation of resources between internal and external clouds. However, I consider that before vCloud becomes widely adopted a number of key issues need to be addressed, not least of which is detail on how the services will be licensed and managed.
The last major thread of VMware’s future plans is a shift in emphasis away from server virtualisation towards desktop or client virtualisation with its vClient initiative. The objective is to enable end-users to connect from any device and receive their personal desktop environment. The main component of this plan is the development of a client-side, bare-metal hypervisor that will control the protocols used, so that the end-user experience is delivered according to the device and connectivity the user has.
I believe that VMware has been very bold in making clear how it sees its future, but I have to wonder how much of Paul Maritz’s keynote was aimed at Wall Street and at allaying investors’ fears about future revenue streams; this road map of how it plans to grow the business and deal with the threat of Microsoft’s entry into the market provides that audience with what it needs, but is the technology mature enough to support this radical shift in the data centre?
Thursday, 7 August 2008
How Green is your PC
This war of marketing, or, as I prefer to call it, recycled packaging, began in 2006 when Dell announced its ‘plant a tree for me’ campaign and promptly proclaimed itself the environmental warrior of the IT community. Dell’s announcement said that US$6 per desktop would fund tree planting to cover the carbon emissions the PC would generate over a typical three year life span. I calculated that using an eight hour day and 200 days of operational use a year, and providing the PC is switched off when not used in the evenings and at weekends, this would equate to approximately 4800 hours of use over the three years. I then used the results from the UK government’s Defra report (Defra’s greenhouse gas (GHG) conversion factors for company reporting), which suggests one should assume an average of 0.43 kg of CO2 emissions per kWh of electrical power consumption, to calculate that a standard 220 Watt PC would emit 454kg of CO2 over the three years, which appears a lot for just US$6 to offset.
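The arithmetic above is easy to check; here is a minimal sketch using the figures stated in the text (the 220 W rating, the usage pattern, and the Defra factor of 0.43 kg CO2 per kWh are all assumptions taken from this post):

```python
# CO2 arithmetic for the desktop PC example, per the figures in the text.
KG_CO2_PER_KWH = 0.43   # Defra conversion factor for company reporting
PC_WATTS = 220          # assumed standard desktop PC power draw

def co2_kg(hours_on: float) -> float:
    """CO2 emitted (kg) by a 220 W PC running for the given number of hours."""
    kwh = PC_WATTS / 1000 * hours_on
    return kwh * KG_CO2_PER_KWH

# Switched off evenings/weekends: 8 h/day x 200 days/yr x 3 yr = 4800 h
print(round(co2_kg(8 * 200 * 3)))    # ~454 kg over three years

# Never switched off: 24 h x 365 days x 3 yr = 26280 h
print(round(co2_kg(24 * 365 * 3)))   # ~2486 kg, roughly the 2,500 kg cited later
```

The same function reproduces both figures used in this post, which is a useful sanity check on the US$6 offset claim.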
This announcement (plant a tree for me) irked HP, which has been following a Corporate Social and Environmental Responsibility (CSER) agenda that for many years has embedded Global Citizenship as one of the seven core elements in its corporate objectives. It is also worth noting that HP has been recycling products since 1987; it developed the Designed for Environment (DfE) policy in 1992; and it entered into a joint initiative with the World Wildlife Fund US (WWF-US) in 2006 to reduce the greenhouse gas emissions from its operating facilities worldwide.
However, the recent announcement from Dell stated it had extended its leadership in global recycling, announcing it is ahead of schedule to achieve a multi-year goal of recovering 125 million kilograms (about 275 million pounds) of computer equipment by 2009.
Not to be outdone, HP this week announced it recycled nearly 250 million pounds of hardware and print cartridges globally in its fiscal year 2007 – an increase of approximately 50 percent over the previous year and the equivalent of more than double the weight of the Titanic.
Therefore, as we can see, this war of words, and deeds, looks set to continue, and as people become more familiar with the concept of a carbon footprint, expect it to become even more personal in how each vendor claims to be improving your life.
I consider that although this posturing appears targeted at convincing the consumer that the vendor has valid ‘green credentials’, and that it is not contributing to the problem of greenhouse gases but is in fact part of the solution, it is at least driving the environmental debate and forcing other vendors to follow, which cannot be a bad thing. This matters particularly as we are likely to witness an increased demand for even small organisations to report on their carbon emissions; consider that the PC in the example above, if it was not switched off in the evenings and at weekends, would generate a total of around 2,500kg of CO2 over the three years. Therefore, any approach that raises awareness must be applauded if it helps us reduce our own carbon footprint.
Wednesday, 23 July 2008
Virtualisation: do not choose for today, consider tomorrow
The rest of the week I will be working on my presentation for VMworld in September, so if you are going I will see you there.
Virtualisation Market Hots Up
The whole virtualisation market is dynamic, and as such represents a daunting prospect for many CIOs, who must decide which vendor to go with and what gotchas the technology has hidden.
My advice is to not look beyond a 3-5 year time frame, as the market is changing rapidly: a leading vendor today may not be leading then, or the technology will have changed. You do not want to make a lifetime commitment to Betamax; it was the better technology, but VHS won the battle. So beware: consider the value you can get, build into the cost model a complete replacement of the solution in 5 years, then review the figures and see if they make sense.
Friday, 11 July 2008
VMware changes at the top
VMware ditches the Green agenda
VMware announced this week that the board has voted Diane Greene out as CEO and president, to be replaced by an ex-Microsoft and EMC SVP. The key question is why, and what does this mean for virtualisation, and for VMware in particular?
My view is that Diane, as nice as she was, was destined to be moved out because VMware has become increasingly isolated in the virtualisation market, or, to be exact, it is losing the marketing war on interoperability of virtualisation. I, like most others, assumed Diane would be given time to show how VMware was going to react to Microsoft’s Hyper-V entry to the market.
I guess the vote indicates that the board did not believe her approach would address the increased competition in the market. Her replacement, being from EMC and an ex-Microsoft executive, is an interesting choice, and indicates that EMC is taking a more hands-on approach to VMware than is publicly visible.
I would expect VMware to start to be more vocal about its partnerships, and to begin to build more open links with the likes of Citrix, Microsoft, and others. The virtualisation market is still in a state of flux, and just because VMware is dominant today does not mean it will be in three years’ time. To maintain its lead VMware must re-invent itself and be the champion of interoperability between hypervisors. By doing this, it will increase the potential market size, and therefore increase its share of the revenues.
An area that remains potentially very fertile is desktop virtualisation. In this space VMware has made some strides, but with Citrix having a massive installed base of terminal services customers, VMware must work hard to build on its brand name.
These are, I believe, interesting times for VMware; it has everything to gain and everything to lose, so it must walk a careful line if it is to remain the face of virtualisation. What now for Diane? Well, I would expect her and some colleagues to begin a new start-up in an adjacent market, and to try once again to become a dominant figure on Wall Street.
Monday, 7 July 2008
A Damp day in Hull
CTO or CIO who holds the power
The role of IT is changing: it is moving towards a period of transition to a position where IT is embedded in the organisation and is managed more locally by the people it is designed to help, while still needing the holistic, cross-departmental perspective. This transition, I believe, will be the catalyst for the clarification of the roles of CIO and CTO, which are currently not clearly defined (in fact the organisational structure is very haphazard, and examples of the CTO reporting to the CIO and vice versa are common). I believe that the office of the CTO should include the architecture, strategy, research and development, and planning operations, while the CIO should be responsible for the delivery of IT as a service to their customers as efficiently and effectively as possible, and be focused on the extraction of business value from the IT resources. The CIO will need to have a voice in the office of the CTO so that operational considerations are taken into account when designing the architecture of the future.
However, given that the two roles address different needs, and therefore have different agendas, I believe that the current arrangement (as haphazard as it is) is not sustainable in the long term; I consider the two roles need to be separated and not be part of the same department. This would enable the forward-looking strategic decisions to be made taking all aspects of organisational needs into account (IT, people, culture, money, and market forces), with operational effectiveness being the prime consideration of the CIO, while the CTO is more focused on the technology, and the architecture in particular.
Monday, 30 June 2008
Home sweet home
Working on a report is fun; you get to research a topic in depth, build up a detailed knowledge of a topic, and discover what has been happening in the market. The down side is most of this is done from home, and as such you do not get out and meet people, hence not much to say.
The world in a URL
The opening up of the dot name space, so you can have dot anything (up to 64 characters long), will be a nightmare for brand managers: do they protect all the brands, all the tag lines, and all the possible derivations, which could be thousands, or do they just register the brand name?
This is a risky business, as the brand reputation could be hijacked and destroyed on-line; the alternative is managing and funding all the domain names that could be used to link to the brand. Glad I am not a brand manager.
Wednesday, 25 June 2008
Chaos rules
Yesterday was a write-off: I had the dentist, the doctor’s for the kids’ jabs, British Gas doing an inspection, and school. And the day was trashed because the nurse got up late and was behind with her work. That small thing made my whole day one of catch-up and rearranged appointments. The journalists were OK, and I made the vendor briefing, but I do not want another day like that.
Hyper-V worth the Hype?
I thought 2 August was the 180-day mark for when Hyper-V would be released, but I am picking up noises that suggest it is next month; I may be wrong on that. Hyper-V is a basic hypervisor, and as such lacks some of the more advanced features that VMware, Citrix (XenServer), Virtual Iron, etc. have. I believe that the link-up with Citrix demonstrates that Microsoft is going after the SMB sector with Hyper-V and leaving XenServer to compete in the enterprise market with VMware, while it works on making Hyper-V as technically capable as its rivals.
VMware, with its price bundles, is attempting to move into the SMB space; however, what VMware provides in capability it lacks in a clear understanding of the market and of how to deliver to the SMB sector. Smaller vendors such as Virtual Iron and Parallels have created a good reputation in certain SMB markets, but they lack the funding to raise virtualisation’s profile. Therefore, I believe that as Microsoft winds up its PR message, the smaller vendors can ride on its coat-tails and enjoy more success.
Windows Server 2008 is a very good product, and I think it will become more widely used as the business case evidence is released to support Microsoft's claims of reduced management time and hence cost savings. As for Vista, there is an argument from a support perspective for pairing Windows Server 2008 with the Vista desktop, but with talk of Windows 7 (the Vista replacement) due in the 2009 time frame I think many may hold fire (if they can), which means organisations whose refresh is due in 2009 or 2010 will face a dilemma: use Vista or stay on XP. Evidence is mixed on this, but I believe Vista will be more widely adopted in conjunction with Windows Server 2008, though not in every case.
Monday, 23 June 2008
Only 6 weeks until the football season starts
Virtualisation aims at the desktop next
The take-away from the Citrix and VMware analyst events was that both have woken up to the fact that what the analysts have been saying for the past 12-plus months is coming true. Server virtualisation was full of gotchas; end-users found them out, and the management vendors were not ready. The result was that it stalled and damaged confidence.
However, the rise of desktop and application virtualisation is characterised by other vendors being ahead of the game, and the virtualisation vendors being slow to recognise its value. We will have to wait and see how this battle shapes up, as Citrix has a leading position and it is theirs to lose, but its market is based on the old paradigm, not the new. Therefore, I see this as a more level playing field where the best proposition wins out, and so far Citrix has a good story and VMware some cool technology. What is needed is a mix of both.
Wednesday, 18 June 2008
A Damp day in Munich
I will get home at 11pm, and then off to Vmware tomorrow for another overnight stop.
Think about image management in a desktop virtualisation world
The big topic that VMware and other VDI vendors neglect to mention is that managing the images, and then the patches, is an operation that must not be under-estimated. Citrix has a different approach, and it certainly resonates with hard-pressed server managers.
Consider how you will manage, provision, and patch the centrally hosted desktop images and applications. This is an area that should be considered, as the value you can obtain from centrally hosted and managed images, combined with a method of image management, provides a double cost saving.
Tuesday, 17 June 2008
BHX and a coffee, with the internet on the go
Windows Server 2008 points the way to Microsoft’s future approach
Windows Server 2008 is a surprisingly diverse product, with some very small new enhancements that could be easily overlooked, and some large high-profile additions that Microsoft is certainly not allowing anybody, including the media, to overlook. However, what is the balanced view on Windows Server 2008?
Firstly, by 2 August it will have an inbuilt hypervisor – a hypervisor enables the virtualisation of commodity server hardware so that it can support the execution of multiple Virtual Machines. Hyper-V, as it is known, is not the most technically advanced hypervisor on the market – that award goes to VMware – but it is a very good basic hypervisor with a couple of interesting features: the ability to execute a Xen-based Virtual Machine, and the concept of synthetic device drivers. The synthetic device drivers are new high-performance device drivers available with Hyper-V; rather than emulating an existing hardware device, Microsoft exposes a new hardware device that has been designed for optimal performance in a virtualised environment.
One of the smaller and easier to overlook features of Windows Server 2008 is the ability to have a finer-grained password policy, which may sound dull, but consider the IT department that is supporting ‘C’-level executives who do not necessarily have the time, or inclination, to maintain a complex alphanumeric 10-character password that is forced to be changed every 30 days. This finer control allows these users to have different rules to, say, a database administrator, which enables IT to ensure that password policies are designed appropriately for the role/purpose of the account.
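As an illustration only (this is not Microsoft's API; the roles, rule values, and function below are hypothetical), fine-grained password policy simply means different rules applying to different classes of account:

```python
# Hypothetical per-role password rules: a relaxed policy for executives,
# a strict one for database administrators. Values are illustrative only.
POLICIES = {
    # role: (minimum length, must mix letters and digits, max age in days)
    "executive": (6, False, 90),
    "dba":       (10, True, 30),
}

def password_ok(role: str, password: str) -> bool:
    """Check a password against the policy assigned to the account's role."""
    min_len, need_alnum, _max_age_days = POLICIES[role]
    if len(password) < min_len:
        return False
    if need_alnum and not (any(c.isdigit() for c in password)
                           and any(c.isalpha() for c in password)):
        return False
    return True

print(password_ok("executive", "sunset"))   # True: the relaxed rule is enough
print(password_ok("dba", "sunsetsunset"))   # False: the DBA policy demands digits
```

The point is the mapping from role to rule set: one global policy is replaced by a lookup, so the strictness matches the risk profile of the account.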
The other big feature of Windows Server 2008 is the introduction of server core, a stripped-down operating system. This, according to Microsoft, requires up to 40% fewer patches to be applied, and occupies significantly less disk space than the full Windows Server 2008. I consider this to be a major advancement, which will enable organisations to install server core on systems such as file and print servers, reducing the maintenance required, and hence the operational cost.
Other features that are worthy of a mention at this stage include: role-based installation of features, simplified clustering using the wizard concept, read-only domain controllers, a modified boot process that brings the firewall up earlier and so reduces the window of vulnerability, and the use of Network Access Protection (NAP) so a health policy can be set for anything connected to the network.
I consider that, unlike its code base cousin Windows Vista, Windows Server 2008 actually provides system administrators with the capabilities needed to make their operational lives easier. I am not predicting a massive up-take for Windows Server 2008 this year, but I believe that as organisations plan to refresh their technology, Windows Server 2008 will be selected because it has been designed to make management simpler, and hence reduce operational costs.
Monday, 16 June 2008
Monday morning I feel fine, I have work on my mind
Today has four telephone sessions, one with the team, two vendor briefings, and one journalist, then I will finish my last outline for the report in between.
Virtual Management on the network
An aspect of virtual management that is often overlooked is network management, and how it becomes more complex in a virtual network world. Why? Because in a virtual world you are not restricted by the physical connections, so you could potentially create recursive loops and, instead of having network separation, have complete openness.
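To make the loop risk concrete, here is a sketch that models virtual switch links as a directed graph and detects a cycle with a depth-first search; the topology names are purely hypothetical:

```python
# Detect a recursive loop in a virtual network topology, modelled as a
# directed graph of links between virtual switches.
def has_loop(links: dict) -> bool:
    """Return True if the link graph {node: [neighbours]} contains a cycle."""
    visiting, done = set(), set()

    def dfs(node) -> bool:
        if node in visiting:   # back-edge: we have come round in a loop
            return True
        if node in done:
            return False
        visiting.add(node)
        for nxt in links.get(node, []):
            if dfs(nxt):
                return True
        visiting.remove(node)
        done.add(node)
        return False

    return any(dfs(n) for n in links)

# vSwitchA -> vSwitchB -> vSwitchC -> vSwitchA: a recursive loop that
# physical cabling would normally make impossible.
print(has_loop({"A": ["B"], "B": ["C"], "C": ["A"]}))  # True
print(has_loop({"A": ["B"], "B": ["C"], "C": []}))     # False
```

In a physical network the cable plan constrains the graph; in a virtual one nothing stops an administrator wiring A to B to C and back to A, which is exactly why topology checks like this belong in virtual network management.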
Friday, 13 June 2008
Friday, Friday the day we all adore, except fish that is
I have a call with an end-user today on the report to discuss how they have approached IT strategy, all good background for the report. Do not forget to complete the survey please.
It’s the Network no it’s the Application
Sounds like a playground squabble? Well, that's life in most siloed IT departments. This situation has given rise to a new generation of monitoring tools, such as NetScout's, which are passive and non-intrusive, using either port mirroring or passive tapping techniques.
The benefit is that these tools can monitor packet-level transmissions, and this level of information is important for understanding normal behaviour. The problem with such tools has been that they generate vast amounts of data, and it took time to analyse it and make it relevant.
Today, these tools can provide sub-second real-time data, and present it in different ways to match the audience. Therefore, it is not surprising that many telcos are using these tools to ensure quality of service.
Beware: the benefits are great, but ask what effort is required to set up and administer the tool once operational; that is where you will tell a good tool from a bad one.
Thursday, 12 June 2008
Typical British climate, hot one day, wet and cold the next
On a wet, dull day, working from home when nobody is here does take the shine off it, but the work output will increase, so it is swings and roundabouts.
EA or IT Strategy which came first
One question that appears to be the cause of much debate on the discussion forums and Twitter is what role EA plays in an IT strategy. The proponents argue that it either defines the strategy, or is defined by it.
However, if you examine the process of strategy as it relates to IT, it becomes clear that EA is a component part of that process. That is to say, it works in conjunction with the strategy development process; it neither defines the strategy nor is defined by it. The role of EA is to translate the business strategic imperatives into an IT architecture, whereas an IT strategy considers more than just the architecture, as organisations, and strategies, are constructed from a combination of people, process, and technology.
Therefore, neither came first; they are twins hatched from the same egg.
Wednesday, 11 June 2008
Webinars save the planet, but do not meet the speakers needs
Anyway, today is day two of the report: more outlines to produce and get approved.
The Virtual Desktop: will it happen?
The death of the desktop PC has been predicted by many in our industry for years, but we still have millions of them in use all over the world, and with the cost of the hardware falling the economics look like keeping the status quo.
However, as the world goes more mobile, and the devices used require less local processing power, the need for smaller intelligent devices that can connect to the data sources and applications will increase. This will be the point at which organisations look at the value of the desktop in terms of how it fits their business model.
Tuesday, 10 June 2008
Home Home in the office
The good thing is that I am in charge of my own work load, so if it is a hot and sunny day, I may well be in the garden, and catch-up when it is wet and windy (we are talking about a British summer here).
Tripwire, cool and useful – not something you hear everyday
In the world of server virtualisation one of the biggest challenges is ensuring that all the virtual machines are operating according to a strict set of policies. Tripwire has developed, in conjunction with VMware, a free downloadable tool that allows organisations to set policies and to test how the current environment conforms to them; a Hyper-V version is expected shortly.
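As a rough illustration of the idea (this is not Tripwire's actual tool or its policy format, which I have not examined; the settings and values below are hypothetical), policy-based conformance checking amounts to comparing each VM's settings against a baseline and reporting the deviations:

```python
# Hypothetical VM policy baseline: the settings every VM should conform to.
POLICY = {"tools_version": "3.5", "memory_mb_max": 4096, "isolation": True}

def check_vm(vm: dict) -> list:
    """Return a list of policy violations for one VM's settings dict."""
    issues = []
    if vm.get("tools_version") != POLICY["tools_version"]:
        issues.append("out-of-date tools")
    if vm.get("memory_mb", 0) > POLICY["memory_mb_max"]:
        issues.append("memory over allocation limit")
    if vm.get("isolation") != POLICY["isolation"]:
        issues.append("guest isolation disabled")
    return issues

vm = {"tools_version": "3.0", "memory_mb": 8192, "isolation": True}
print(check_vm(vm))  # ['out-of-date tools', 'memory over allocation limit']
```

The value of a tool in this class is not the comparison itself, which is trivial, but keeping the baseline current and running the check across hundreds of VMs automatically.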
I am a believer that one signal of a maturing market is when the eco-system includes freeware and many tools that help solve problems which only become apparent when technologies are used in live production situations. Virtualisation is rapidly moving in this direction, although it still has some way to go before you could class it as a mature technology.
Monday, 9 June 2008
Office day
I.S. Strategy
The purpose of an I.S. strategy is to define the direction that I.T. is taking in support of the business objectives. I am working on a six-step model for the stages to be addressed in the production of a strategy, which should be completed by the end of July.
To start this discussion: what is the role of a mission statement? I believe that this is stage one, and should be about telling different audiences what I.T. does, and why it does it. This is as much about motivating the team as it is about telling the business what your role is.
Friday, 6 June 2008
At home with no cat
Discovered a new on-line site yesterday Snapfish.com, I will check it out over the weekend.
HP Labs a proper report
Last week HP made major announcements about the structure of its Labs, and the focus of its research. The number of commercial organisations that still retain a pure research capability, as opposed to a development capability, is limited to a handful of the leading vendors. The challenge for these organisations is to ensure that pure research can be funded, but is still compatible with an era where the focus is on value and reducing costs.
HP spends approximately US$3.6B on R&D, with nearly 30,000 employees working in business units, but pure research not connected to the business units accounts for about 5% of this, a budget of US$150M and 600 researchers. HP Labs is a corporate function with the remit to look at new innovations that HP can take from concept to product in the long term.
Thursday, 5 June 2008
HP Labs, a great place for a good idea
The early start was not what I wanted, but the day was excellent, plenty of good debate, and plenty of stuff that would blow your mind.
HP Labs
HP has taken a practical and different approach to research: it has designed and built a governance model that helps it decide what to research. The work is designed with the environment in mind, and HP is showing leadership in how to do blue-sky research in a commercial organisation and demonstrate that it is valuable.
Wednesday, 4 June 2008
Sunny day in London, but a cloud in my heart
I was positive until I saw the three or four duffers that scored me that way, but waiting here at Paddington stn I am sure I will bounce back.
WAN Optimisation
Three aspects of managed services in this space to consider.
Keeping up with advancements in the technology and the network are the main reasons for using a managed service; it can also be cost-effective in SMBs or organisations that do not have the network skills in-house. Another reason for managed services is the ability to make the service subject to penalties, so you can guarantee to users the level they need to operate the business. This obviously has the side-effect that you are not in control but remain responsible for that level; the penalties would compensate you, but may not help you keep your job.
Not everything will benefit from WAN optimisation; it needs to demonstrate a clear increase in the service that is provided for the cost. For example, VPNs do not benefit from WAN optimisation as much as some other traffic, so make sure you use the correct solution for the correct job. Also, data de-duplication is beginning to gain traction; WAN optimisation will still work, but if you are sending duplicate data you are not being as effective as you could be.
Is any industry unsuited to this? Not that my research has found: the managed services market is beginning to be seen as a good option for all industries. It comes down to how much control the organisation wants or needs over the network, as a managed service is good, but only as good as the contract you have and the provider.
Tuesday, 3 June 2008
Systems Management is set to get hot
Between now and lunch I am writing two pitches for the magazine; these are evaluated by the editorial committee, so if your pitch is not accepted you get to write somebody else's, which is motivation enough to put in good pitches.
ASG to make a significant acquisition?
ASG, the ITSM and BSM vendor, has been rumoured to be about to make a significant acquisition. A few years ago the rumour mill had ManagedObjects as a potential target, but this time round the rumour mill is quiet, probably because it got it wrong last time.
If the rumours this time are true, and they reportedly came from Mr Allen himself, then this will signify that the systems management market is becoming more competitive. If the rumoured acquisition is of a rival, i.e. little technology transfer, then it can be taken as being about market share and customer numbers; however, if it is a new technology then it should be to fill the current gaps in ASG’s solutions compared to its rivals, such as IT Governance, Enterprise Asset Management, or automation. Either way it shows ASG is serious about shedding its label as the ‘best kept secret in systems management’, and certainly over here in Europe it will become a serious contender to the likes of HP, IBM, CA, BMC, and now Microsoft.
Monday, 2 June 2008
IT Service Management coming out of its IT shell
I am also working on planning the Virtualisation events for the company, which means sorting out the content, picking the speakers and putting the agenda together, and writing the material the sales team can use to sell the events.
ITSM and the increasing role of BSM
The significance of ITIL® v3.0 is that it begins the process of externalising IT Service Management (ITSM), which it could be argued is what Business Service Management (BSM) has been doing since 2001. However, the difference is that although both are making the connection with critical business processes, ITIL® v3.0 has been developed over time in-line with the technology that can support and deliver its capabilities. BSM on the other hand was developed as a concept that although sound lacked the technology to drive increased adoption.
Today, the growth of x86 virtualisation is a catalyst for both concepts to deliver real business benefit, and the advantage that each has is clear:
ITIL® has strong connections with IT processes and procedures, whereas BSM has strong links to Business Process Modelling (BPM). The two disciplines are coming at the same problem from different angles, but I believe BSM will be used for the business face and ITIL® v3.0 for the IT face. By combining them, an organisation can exploit IT for business benefit.
Friday, 30 May 2008
Vista's replacement first sightings
New Microsoft Operating System
The first public news on Windows 7, as it is code-named, was revealed on YouTube. The replacement for Vista, which had few new and compelling features (hence the slow uptake in the market), appears to be moving towards the Apple iPhone concept of touch screen, voice commands and gestures. However, Microsoft have been very secretive about this and very few people have been party to their thinking, but they need to get their skates on, as the rumours suggest it is due for release in the 2009/10 time frame.
Whatever Windows 7 looks like, the one thing it must do is provide customers with the features they want and do not yet know they want, the features they already know they want, and what they will need to compete in business successfully over the next five years. Therefore, it should be more like Server 2008: built to make our lives easier and to increase productivity, security and collaboration.
Thursday, 29 May 2008
VMware continues to broaden its capabilities
The weather has picked up and we look set for a dry day, the first for about a week. This helps, as I find walking and running through my presentation very helpful, so if you see some chap who appears to be talking to himself, listen a little closer, it could be me practising. The reason I do it is that it is better to speak out loud, as you hear what you are saying, and it gets your vocal cords ready for projecting your voice.
VMware acquires another company
VMware announced it has entered into a definitive agreement to acquire B-hive Networks, Inc., a privately-held application performance management software company with headquarters in San Mateo, California and principal R&D facilities in Herzliya, Israel. With this acquisition, VMware will leverage the B-hive team and technology to offer proactive performance management and service level reporting for applications running within VMware virtual machines - on both servers and desktops. In addition, B-hive’s R&D facility and team will form the core of VMware’s new development center in Israel. The terms of the acquisition, which is expected to be completed during the third quarter of 2008, subject to customary closing conditions, were not disclosed.
This is just another move by VMware as it expands its capabilities in the virtual market-place, because as the hypervisor becomes more available and less differentiated, the value-add capabilities will be in the management space; but this requires broad coverage and new thinking on how organisations will get value from a virtualised data centre.
Wednesday, 28 May 2008
The rain comes from clouds, so is cloud computing a damp squib
What next for the Cloud
The concept of cloud computing is an interesting topic, and one which will no doubt take up many column inches over the coming years. For now I believe its use will be restricted to internal cloud deployments, because the idea of letting critical business processes and data be supplied from a pool that is potentially available to all subscribers is not a prospect many CIOs will feel comfortable with.
However, the SMB sector will probably embrace the concept, as it will make starting and running a business less complicated, especially if you need IT and do not understand the market. At this stage my views are not yet well enough formed to produce a compelling argument either for or against the concept. I shall be visiting the HP Labs next week and will be discussing clouds.
Tuesday, 27 May 2008
Virtualisation does not take bank holidays
Where are the big deployments of Desktop Virtualisation?
As an analyst I have been talking and writing about virtualisation for some time, and desktop virtualisation, the server-hosted format that is, has been promising large benefits for over a year now. However, uptake in the user community remains low, and this has prompted me to do some investigation.
The first issue appears to be that organisations are struggling to get their server-side virtualisation projects working and bedded in, and this experience has made them more cautious than they were 12 months ago.
Secondly, the emotional attachment employees have to their desktop or laptop computer is proving harder to break than most observers first thought. This in itself will not stop a determined organisation, but it does require some groundwork to be conducted before these beloved items are removed.
Finally, the problem of getting power into data centres is causing a re-think, because if all the desktop computing power is to be housed in the data centre, this will require significantly more space and power. Currently, data centres are finding it difficult to increase their power supply, and must first reduce consumption before they can accommodate this new centralised demand.
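A rough back-of-the-envelope sketch shows why the power question matters. All the figures below are illustrative assumptions of mine, not measurements: a nominal per-session server draw and a typical data-centre PUE (power usage effectiveness) multiplier to cover cooling and distribution overhead.

```python
def extra_datacentre_load_kw(desktops: int,
                             watts_per_session: float = 25.0,
                             pue: float = 2.0) -> float:
    """Rough extra data-centre power draw (kW) if desktop sessions
    move into the data centre.

    watts_per_session: assumed server-side draw per hosted desktop.
    pue: assumed power usage effectiveness (total power / IT power),
         covering cooling and distribution losses.
    """
    it_load_kw = desktops * watts_per_session / 1000.0
    return it_load_kw * pue
```

Under these assumptions, hosting 1,000 desktops adds roughly 50 kW of total facility load, which is exactly the kind of centralised demand a power-constrained data centre struggles to absorb.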
I believe that server hosted desktop virtualisation will become the de facto standard approach, but it will take another three to five years before it gains sufficient momentum to overtake deployments of PCs.
Friday, 23 May 2008
EA – is it the role for you?
Enterprise Architecture (EA) an art or a science
For the bank holiday weekend I have decided to discuss the role of EA, and in this blog concentrate on the capabilities needed to perform the role. In IT we believe everything can be decomposed to 1s and 0s, because that is the way the computer works, but EA is a discipline that attempts to cross the divide between the business world of £s and NPVs and the world of 1s and 0s.
Currently EA is in an embryonic state; by that I mean that as a profession it is relatively immature compared with engineering, where the status of Chartered Engineer (CEng) is recognised as a symbol of competence and ability, presided over by a well-structured standards organisation. It is my contention that EA requires a similarly rigorous body to standardise the skills and training needed to become an Enterprise Architect. Only by raising the profile, and making it easy for employers to recognise that an individual has the skills to be an Enterprise Architect, will the role become more organisational in its scope and so break away from its current association with IT.
I believe that EA should be a discipline that reports to the CEO, not the CIO, as its purpose is to enable business strategy to be translated into executable programmes that can deliver organisational value over the long term.
Thursday, 22 May 2008
HP is solid as I get over jet lag
HP has a solid quarter in EMEA
HP announced its quarterly earnings for EMEA yesterday, and they can be reported as more solid than spectacular. However, that has to be set in the context of a very good previous quarter. On the whole HP out-performed the market, and reported that it was seeing excellent growth in emerging markets such as Russia and the Middle East.
Once again the trend in the Personal Systems Group (PSG) was that notebooks continue to grow, with revenues up 31% and units up 46%, demonstrating that prices are continuing to fall in a very competitive market. Desktop revenues were flat and units up only 2%. These figures indicate that HP is moving more units in the commercial sector and is getting the balance correct; the commercial and consumer sectors grew 17% and 16% respectively in terms of revenue. This balance enables HP to maintain a strong presence in the highly influential consumer sector, where I believe many innovations are piloted by home users and then translated to industrial use, for example Skype.
Technology Solutions Group (TSG) saw the software division grow by 28%, and it now represents 7% of this group’s revenue. The biggest highlight is that blades are continuing to grow, and HP reported a near 60% market share in EMEA. Add to this the EDS acquisition, and this group should continue to see good growth prospects.
Wednesday, 21 May 2008
Day 3 for Pulse, but home for me
Today is a catch-up-on-sleep and general bits-and-pieces day, because if you are clever you use the extra hours flying over to the US to do any work and get ahead of yourself, so today can be an easy day.
Pulse is Green
IBM are beginning to drive home the message that if you want to make energy savings in the data centre you have to be able to measure and control devices at a granular level; however, the real value comes from taking this information and making it relevant to a non-technical audience by demonstrating which business services are consuming what, and when.
IBM has a really good story to tell here, but it is struggling with how to market it, and for me the title ‘big green’ does not work. I applaud what IBM has and how it could eventually be used even by non-IT management, but I think the potential audience is currently a confusing mix of IT and facilities teams. Therefore, until organisations resolve those internal responsibilities, IBM is trying to sell to two different groups with different agendas.
Tuesday, 20 May 2008
Day 2 of IBM Pulse
Day 2 in Orlando. The meal last night was vast, so I had to get up at 6 am and go to the gym. Good job, as we have an early start: a 7.30 breakfast meeting, then one-to-ones and more sessions before I leave for the airport at 4 pm. Today will be about taking more notes and catching up on e-mails once the analyst session has finished, before the car comes to pick me up.
IBM Pulse, the focus is on the future
The approach being taken by IBM is setting them up for a future where the world is driven by management and the ability to control and automate. The strap-line is
Visualise,
Control,
Automate
You cannot automate what you cannot control; you cannot control what you cannot see. These three strands are reflected in the way IBM has acquired and developed its products. IBM has taken this view to a wider perspective than just IT, and sees the inclusion of plant machinery as the next logical step, but the IP-enabled devices that IPv6 will support represent the real future use.
Monday, 19 May 2008
Florida sunshine
IBM Pulse Service Management
The event has 4,500 attendees and is the merger of three previous events: Netcool, Tivoli, and MaximoWorld. The message from IBM is that IT is the component that underpins every business process, and in fact impacts how and what we do today in terms of how organisations grow and become more profitable.
The message is very compelling, and the tools and benefits all look great, but are organisations ready for the shift that is needed to accommodate this new world? In my opinion some are, but most are not. Therefore, this vision will need organisations to be guided through a transformation project and restructuring. This is the business IBM got into earlier, so as you can see this is a planned move, and it looks like the correct strategy for IBM to become a leader in IT management over the next ten years.
Friday, 16 May 2008
Virtualisation can help with the green debate
The Green debate is still rumbling on
According to recent scientific reports there is now an added urgency for a more comprehensive international climate agreement post-2012. According to the most stringent scenario outlined by the Intergovernmental Panel on Climate Change (IPCC), the global average surface temperature can still be limited to an increase of 2 degrees C above the pre-industrial level. Staying within this limit means a reduction in global greenhouse gas emissions of at least 50% below the 1990 level by 2050.
Currently all the media attention is focused on the aviation and transportation sectors as the villains; however, the IT data centre, and the computer in general, waste a significant amount of energy every day. UK Government figures suggest that an organisation can save UK£50 per PC per year in energy costs simply by ensuring PCs are switched off after work and at weekends. The data centre represents an even bigger prize for organisations that address the issue of under-utilised servers.
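The per-PC figure scales up quickly across a fleet. A minimal sketch of the arithmetic, taking the quoted UK£50 per PC per year as given (the fleet size in the example is hypothetical):

```python
def annual_pc_energy_saving(pc_count: int,
                            saving_per_pc: float = 50.0) -> float:
    """Estimated annual energy-cost saving (GBP) from switching PCs
    off out of hours, using the UK Government's quoted GBP 50 per
    PC per year figure."""
    return pc_count * saving_per_pc
```

For an illustrative 2,000-seat organisation that is UK£100,000 a year recovered by doing nothing more sophisticated than turning machines off.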
In fact, rumours from the Parliamentary Renewable and Sustainable Energy Group (PRSEG) suggest that post-2012 the EU Emissions Trading Scheme will be extended, and this extension will inevitably affect more organisations than the current scheme. One possible approach would be a carbon emissions cap on organisations, which would force them to look at their energy consumption, and I believe IT can offer solutions that enable a reduction. I have been advocating that organisations consider the power and cooling impact that IT data centres represent, and adopt new technologies that can significantly reduce their energy consumption, and hence the organisation’s carbon footprint. I am a firm believer that virtualisation is one such technology that organisations should be actively investigating, because, as the United Nations Climate Change Conference in Bali stated, action is needed now. Therefore, I recommend that organisations assess the contribution IT computing resources make to overall organisational energy consumption, and address it before legislation forces them to report on and then reduce their energy consumption, or face penalty charges.
Thursday, 15 May 2008
The services market gets more competitive
HP and EDS a good marriage?
The purchase of EDS by HP has not been widely received as positive, because HP has a reputation for taking a long time to integrate its mergers and acquisitions before they demonstrate stakeholder value. However, I believe that in this case HP should be able to re-brand EDS and merge its own services function into it, which will be a quicker approach and faster to return value. The key to why HP has taken this step can be seen in its data centre automation solutions; applied in services contracts, these technologies will allow the services company to increase margin by significantly reducing cost. It also gives HP the ability to add the EDS data centres to its own very aggressive plans (currently well under way) to restructure its network and data centre configuration, which is yielding significant savings and providing HP with some more green credentials.
Wednesday, 14 May 2008
What is Microsoft doing on the desktop
SIMtone have sent me a trial of their Virtual PC beta software to test; the only problem is that the device has a US plug, so I need to buy a US-to-UK converter first, as my converters are all the other way, for obvious reasons. So any spare time will be spent setting that up.
Windows XP Service Pack 3
Windows Vista has received a bad press from the media and us analysts alike; the truth is that the number of deployments is increasing slowly, in line with that of XP when it was launched. However, by releasing XP SP3 Microsoft are sending a very confusing message: if SP3 were just bug fixes and security updates, nobody would argue, but by including NAP (Network Access Protection) in the release they are adding new functionality to a product they want people to migrate from.
I believe the launch of Windows Server 2008 will be the catalyst for Windows Vista migrations to accelerate in the corporate world, as Server 2008 and Vista share the same code base, and they have been designed so that deploying Server 2008 makes managing the desktop Vista client much easier. This, I believe, will be the big driver for organisations when they consider upgrading their server operating systems, because combining the desktop upgrade as well can yield management savings. Only time will tell, but I for one am not writing Vista off just yet, although I am still on XP for all my work and home PCs.
Tuesday, 13 May 2008
Virtualisation prices soften as competition grows
VMware – Feeling the heat from Microsoft
As I predicted last year, and have been saying at speaking events for the past 12 months, this year (2008) will see prices soften in the virtualisation space, especially from VMware. The imminent launch of Hyper-V (2nd August is the latest date) will see the Microsoft marketing band-wagon push the technology to the SMB sector, which has been put off server virtualisation by the up-front cost of implementing VMware, or the uncertainty of using open source. Citrix’s acquisition of XenSource, Oracle’s OVM (based on Xen) and Microsoft have changed the game. VMware is now trying to build its reputation with SMBs as a ready-to-use, cost-effective route to deriving added value from server virtualisation.
Watch this space: over the next 12 months VMware will be creating a new position for itself as it differentiates itself from Microsoft in the virtualisation wars of 2009.
Monday, 12 May 2008
A day in the office
The VMware news will have to wait; it was given under NDA until today, US time, so I will update you tomorrow.
CMDBs Start small and expand as you go
CMDBs are an excellent tool, but beware when choosing one: ensure you can start small and at a high level, then increase the granularity and scope as you become more familiar with the tool and more refined in your processes. ITSM is a good solid bedrock from which bigger and better things can grow. More to come over the coming months.
Friday, 9 May 2008
A world of virtual things
I also have only one vendor briefing today, VMware, which you will hear about on Monday. Have a nice weekend. I have decided to write this blog in two parts: a diary for anybody interested, and an analyst comment on the technologies I follow.
Virtualisation
The idea of infrastructure virtualisation is to increase the utilisation of servers and storage; it can also increase agility and delay capital expenditure. However, server virtualisation will also need more storage and will increase your licence costs. Therefore, before you leap, do the maths: does it make sense for your organisation, or are there other solutions, such as enterprise-class servers or mainframes?
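"Do the maths" can be sketched as a simple before-and-after comparison: hardware saved through consolidation versus the extra storage and licence costs virtualisation brings. Every number in the example below is an illustrative assumption, not a benchmark; plug in your own figures.

```python
import math

def consolidation_saving(servers: int, ratio: int,
                         cost_per_server: float,
                         extra_storage: float,
                         extra_licences: float) -> float:
    """Rough net saving from consolidating `servers` physical boxes
    onto fewer hosts at `ratio` VMs per host, offset by the extra
    storage and virtualisation licence costs the move incurs."""
    hosts_needed = math.ceil(servers / ratio)
    cost_before = servers * cost_per_server
    cost_after = (hosts_needed * cost_per_server
                  + extra_storage + extra_licences)
    return cost_before - cost_after
```

With hypothetical figures of 100 servers at £3,000 each, a 10:1 consolidation ratio, £40,000 of extra storage and £50,000 of licences, the sketch yields a £180,000 saving; with a poor ratio or heavy licence costs it can just as easily go negative, which is precisely when an enterprise-class server or mainframe deserves a look.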
Thursday, 8 May 2008
Day 1 in the afternoon
Day 1
Server Virtualization Blog