
Venture Capital Jobs Blog

Curated by John Gannon and Team

Posts Tagged ‘cloud’

Nice cloud computing taxonomy from the folks at Opencrowd

leave a comment »


My buddies at Opencrowd recently published their Cloud Taxonomy.  They do a nice job of bucketing vendors by specific area of focus, which helps clear up some of the cloud classification and nomenclature confusion.

If you find the taxonomy useful, let Brad at Opencrowd know; I am sure he would be happy to hear from you.


Written by John Gannon

April 3, 2009 at 4:52 pm

Posted in Uncategorized


Are Cloud Based Memory Architectures the Next Big Thing? | High Scalability

leave a comment »


I’ve probably said this before, but the cloud is a new computing platform that some have learned to exploit, others are scrambling to master, but most people will see as nothing but a minor variation on what they’re already doing. This is not new. When time sharing was invented, the batch guys considered it as remote job entry, just a variation on batch. When departmental computing came along (VAXes, et al), the timesharing guys considered it nothing but timesharing on a smaller scale. When PCs and client/server computing came along, the departmental computing guys (i.e. DEC), considered PCs to be a special case of smart terminals. And when the Internet blew into town, the client server guys considered it as nothing more than a global scale LAN. So the batch guys are dead, the timesharing guys are dead, the departmental computing guys are dead, and the client server guys are dead. Notice a pattern?

via Are Cloud Based Memory Architectures the Next Big Thing? | High Scalability.


Written by John Gannon

March 17, 2009 at 2:33 pm

Posted in Uncategorized


Cloud as a component

leave a comment »

Robin Harris just made a post about cloud storage as a service that was both insightful and right in line with some of my thinking about cloud infrastructure.

A car wash, haircut or a Google search is a service. You show up in your car, with your hair or a browser and a complete transaction occurs. A job completes.


The number of similar cloud services on the market suggests the wrong question is being answered. While there is a market for raw cloud storage – as Amazon’s S3 has shown – the real opportunity is incorporating it as a component – in a solution to a business problem.

I’m a big fan of companies who are leveraging cloud technology and economics to develop business-focused solutions that solve specific problems.

Although it’s certainly interesting for IT shops to be able to leverage a cloud infrastructure to scale servers on demand or access a theoretically infinite pool of storage, it’s more interesting and valuable to tie these cloud components together to solve business challenges like disaster recovery.  Cloud components are ultimately going to become a commodity, so if you’re planning to play in this space, play higher in the stack and use the components to build a solution that will get business stakeholders (versus IT stakeholders) excited to open their wallets.
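To illustrate the “component” idea, here is a minimal sketch of cloud storage used as one piece of a disaster-recovery job. It assumes boto3 (the current AWS SDK for Python, which postdates this post) and a made-up bucket and directory layout; it is a sketch of the pattern, not any vendor’s actual product.

    import os
    import boto3

    s3 = boto3.client("s3")
    BACKUP_BUCKET = "example-dr-backups"   # hypothetical bucket name

    def replicate_directory(local_dir, prefix):
        """Copy every file under local_dir into cloud storage under prefix."""
        for root, _dirs, files in os.walk(local_dir):
            for name in files:
                path = os.path.join(root, name)
                key = prefix + "/" + os.path.relpath(path, local_dir)
                s3.upload_file(path, BACKUP_BUCKET, key)

    # The business-level job ("protect last night's order data") is what the
    # stakeholder pays for; the commodity storage cloud is just a component inside it.
    replicate_directory("/var/exports/orders", "orders/nightly")

The point of the sketch is that the raw storage call is one line buried inside a solution to a business problem, which is where the value (and the margin) sits.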


Written by John Gannon

December 26, 2008 at 8:26 pm

Posted in Uncategorized


Hypothetical question for the IT guys in the room

with one comment

Assume your IT department runs all operating systems within virtual machines, and you have a fixed budget of $X to spend on a single new management software package.  Your boss tells you that you must purchase one of these packages (it’s year end and you will lose the budget if you don’t use it).

Your choices…

Package #1: This software allows you to manage operating system level and application level configurations in an elegant fashion.  You can assign roles to different servers and ensure that their configuration is always consistent with what you’ve specified in the management system.  Assume this tool has minimal awareness of what’s happening in the hypervisor.  It’s really focused on the guest operating systems.

Package #2: This software allows you to manage virtual machines in an elegant fashion.  Virtual machines can easily be created, tracked, deleted, and copied.  The software also allows you to develop and execute virtual machine workflows that make disaster recovery, test and development, etc. much easier.  Basically, you are able to manage the full lifecycle of a virtual machine as well as automate tasks involving one or more virtual machines.  Assume this tool has minimal awareness of the guest operating systems.
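To make the contrast concrete, here is a purely hypothetical sketch of what driving each package might feel like day to day; neither class corresponds to a real product, and the names are made up.

    # Package #1: configuration management for the guest OS and applications.
    class ConfigManager:
        def __init__(self):
            self.roles = {}          # role name -> desired package set
            self.assignments = {}    # server -> assigned role

        def define_role(self, role, packages):
            self.roles[role] = set(packages)

        def assign(self, server, role):
            self.assignments[server] = role

        def drift(self, server, installed):
            """Report anything missing relative to the server's assigned role."""
            return self.roles[self.assignments[server]] - set(installed)

    # Package #2: lifecycle management for the virtual machines themselves.
    class VMManager:
        def __init__(self):
            self.vms = {}

        def create(self, name, template):
            self.vms[name] = {"template": template, "state": "running"}

        def clone(self, name, new_name):
            self.vms[new_name] = dict(self.vms[name])

        def delete(self, name):
            del self.vms[name]

    cfg = ConfigManager()
    cfg.define_role("web", ["httpd", "php"])
    cfg.assign("web01", "web")
    print(cfg.drift("web01", ["httpd"]))   # {'php'}: web01 is out of spec

    vmm = VMManager()
    vmm.create("dr-test", template="rhel5-base")
    vmm.clone("dr-test", "dr-test-copy")   # e.g. one step in a DR workflow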

Both packages cost $X, so you can only purchase one system.  Assume otherwise that these systems are equivalent in terms of reliability, ability to be customized, etc.

Which package would you choose as either a) the CIO or b) the guy who is doing the hands-on administration work?

Feel free to state if you’re answering for a small, medium, or large IT shop as well.

Looking forward to seeing the responses…

Written by John Gannon

September 2, 2008 at 6:36 pm

Posted in Uncategorized


Cloudbursting and so much more

with 2 comments

The fine folks at Amazon recently posted to their Web Services blog about the idea of ‘cloudbursting’.  It’s an interesting post that touches on one use case for Amazon Web Services: as an overflow system to scale up your processing power on demand, even though you typically run your own datacenter or host servers in a colocation facility.

The post also discusses the concept of a hybrid model of computing, where some computing is done in-house, and some in the cloud.  As they said in the post, when they talk to folks about the idea of cloud computing, people tend to settle for a ‘middle of the road’ solution:

A typical audience contains a nice mix of wild-eyed enthusiasts and more conservative skeptics. The enthusiasts are ready to jump in to cloud computing with both feet, and start to make plans to move corporate assets and processes to the cloud as soon as possible. The conservative folks can appreciate the benefits of cloud computing, but would prefer to take a more careful and measured approach. When the enthusiasts and the skeptics are part of the same organization, they argue back and forth and often come up with an interesting hybrid approach.

The details vary, but a pattern is starting to emerge. The conservative side advocates keeping core business processes inside of the firewall. The enthusiasts want to run on the cloud. They argue back and forth for a while, and eventually settle on a really nice hybrid solution. In a nutshell, they plan to run the steady state business processing on existing systems, and then use the cloud for periodic or overflow processing.
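Just to make the overflow pattern concrete, here is a minimal sketch of that dispatch decision. It is written with boto3 (today’s AWS SDK for Python, which did not exist when this was posted), and the capacity threshold, image ID, and instance type are placeholders, not anything Amazon prescribes.

    import boto3

    LOCAL_CAPACITY = 20            # jobs the in-house systems can run at once (assumed)
    BURST_IMAGE = "ami-12345678"   # hypothetical machine image for overflow workers

    ec2 = boto3.client("ec2")

    def dispatch(queue_depth, local_busy):
        """Run steady-state work in-house; spill any excess into the cloud."""
        overflow = queue_depth - (LOCAL_CAPACITY - local_busy)
        if overflow <= 0:
            return  # steady state: everything stays inside the firewall
        ec2.run_instances(
            ImageId=BURST_IMAGE,
            InstanceType="m1.small",
            MinCount=1,
            MaxCount=overflow,
        )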

I would propose there is another powerful use case for the hybrid model – the outsourcing of specific IT processes into the cloud.

This has been done in the business software world through SaaS, and now, thanks to cloud computing, it can be done in the world of IT infrastructure.  For example, several companies are starting to do interesting things in this hybrid model beyond simply providing overflow computational capacity:

  • Simply Continuous is addressing the painful problem of business continuity by allowing customers to replicate their Wintel-based datacenters into their fully managed cloud.
  • Similarly, Skytap allows customers to create virtual software testing environments and pay by the drink instead of buying hardware and software to support those testing efforts.
  • 3tera recently announced a partnership with Nirvanix where Nirvanix’s storage cloud would be integrated with 3tera’s cloud computing management software.

Would love to hear about other examples that are out there.

Written by John Gannon

September 1, 2008 at 2:49 pm

Posted in Uncategorized


This is the most excited I’ve been about a hard drive in years

leave a comment »

The Amazon Web Services team just announced that they are now offering persistent raw disk-like storage to EC2 customers. Up until now, permanently storing files in Amazon’s cloud required you to use their S3 service (accessible via HTTP). Now, EC2 customers can access their data through a standard UNIX/Linux filesystem and be sure it will be there throughout the life of that instance.
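For illustration, the new workflow looks roughly like this. The sketch uses boto3 (the current AWS SDK for Python, not the tooling that existed in 2008), and the instance ID, availability zone, and device name are placeholders.

    import boto3

    ec2 = boto3.client("ec2")

    # Create a 50 GiB persistent volume and attach it to a running instance.
    vol = ec2.create_volume(AvailabilityZone="us-east-1a", Size=50)
    ec2.attach_volume(
        VolumeId=vol["VolumeId"],
        InstanceId="i-0123456789abcdef0",   # hypothetical EC2 instance
        Device="/dev/sdf",
    )

    # On the instance itself the volume then behaves like a raw local disk,
    # e.g. mkfs -t ext3 /dev/sdf && mount /dev/sdf /data -- a standard
    # UNIX/Linux filesystem, with no HTTP calls to S3 needed for reads or writes.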

Why is this important?

IT departments can begin managing (some) Amazon cloud configurations using legacy systems management tools and techniques.

A major hurdle to bringing any new technology into an IT department is: “How will I manage and support this technology?” For cloud computing, the answer to date has been to roll your own management tools or to work with one of the emerging vendors in the cloud computing management space. I would argue that those answers are a non-starter for most IT shops, because they probably don’t have enough pain to warrant bringing in yet another management tool and the requisite investment in training, etc.  And rolling your own has its own set of issues, especially if your staff doesn’t have the coding and integration skills required.

If I can now manage system and application software in the cloud the way I do in my datacenter, or at least very close to it, the cloud becomes a true option for extending the datacenter beyond the company’s four walls.

Written by John Gannon

August 22, 2008 at 10:45 pm

Posted in Uncategorized

