Fri, 27 January 2012
The way AWS Storage Gateway works is by securely storing data as snapshot images in S3, then restoring that data to the AWS Elastic Block Store (EBS) service if desired. Once there, users can process that data using Amazon EC2 cloud computing instances. Storage Gateway keeps data on local gear while asynchronously uploading it to Amazon’s cloud. This lets companies leverage the cloud when they need it, while easing the latency concerns that come with uploading large amounts of backup data to the cloud and still letting them combine local storage with cloud-based resources.
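To make that snapshot-to-EBS flow concrete, here is a minimal Python sketch using boto3. It is an illustration of the pattern, not Amazon's reference code: the volume ARN, region and availability zone are placeholders, and it assumes AWS credentials are already configured.

```python
import boto3

# Placeholder identifiers -- substitute your own gateway volume ARN and zone.
VOLUME_ARN = ("arn:aws:storagegateway:us-east-1:123456789012:"
              "gateway/sgw-EXAMPLE/volume/vol-EXAMPLE")
AVAILABILITY_ZONE = "us-east-1a"

gateway = boto3.client("storagegateway", region_name="us-east-1")
ec2 = boto3.client("ec2", region_name="us-east-1")

# 1. Ask the gateway to snapshot the locally stored volume into S3-backed storage.
snap = gateway.create_snapshot(
    VolumeARN=VOLUME_ARN,
    SnapshotDescription="Nightly backup via Storage Gateway",
)
snapshot_id = snap["SnapshotId"]

# 2. Once the snapshot completes, turn it into an EBS volume that an EC2
#    instance can attach and process.
ec2.get_waiter("snapshot_completed").wait(SnapshotIds=[snapshot_id])
volume = ec2.create_volume(SnapshotId=snapshot_id, AvailabilityZone=AVAILABILITY_ZONE)
print("EBS volume created from gateway snapshot:", volume["VolumeId"])
```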
One of the more interesting attributes of cloud computing is the ability to invoke compute-intensive resources as a service. Rather than having to acquire and deploy IT infrastructure to handle peak loads or processing requirements, IT organizations can “rightsize” their IT investments to handle their average workload requirements. Any time their computing requirements exceed those average workload limits, they can invoke additional compute capacity in the cloud using a process known as “cloudbursting.”
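As a rough sketch of what a cloudbursting trigger can look like in practice, here is a toy Python example of my own: when the local load average stays above a baseline threshold, it launches an extra EC2 worker. The AMI ID, instance type, region and threshold are placeholders, and boto3 is assumed to be installed and credentialed.

```python
import os
import boto3

# Illustrative values only -- tune to your own baseline capacity.
LOAD_THRESHOLD = 8.0                   # 1-minute load average that triggers a burst
BURST_AMI = "ami-0123456789abcdef0"    # hypothetical worker image
BURST_INSTANCE_TYPE = "m1.large"

def maybe_burst():
    """Launch an extra cloud worker if local load exceeds average-workload capacity."""
    one_minute_load, _, _ = os.getloadavg()
    if one_minute_load < LOAD_THRESHOLD:
        return None  # average workload: stay on locally owned infrastructure

    ec2 = boto3.client("ec2", region_name="us-east-1")
    resp = ec2.run_instances(
        ImageId=BURST_AMI,
        InstanceType=BURST_INSTANCE_TYPE,
        MinCount=1,
        MaxCount=1,
    )
    return resp["Instances"][0]["InstanceId"]

if __name__ == "__main__":
    instance_id = maybe_burst()
    if instance_id:
        print("Burst instance launched:", instance_id)
    else:
        print("Local capacity is sufficient; no burst needed.")
```

A real deployment would also terminate the burst instances once load drops back below the threshold, so you only pay for the peak.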
It’s still early days in terms of the mainstream adoption of “cloudbursting,” but the concept is already leading to the creation of some new business models. For example, Pixar has announced that it is partnering with GreenButton and Microsoft to deliver a new rendering service in the cloud that will be managed by GreenButton using its namesake cloud management software running on the Microsoft Azure cloud computing platform.
The basic idea, says GreenButton CEO Scott Houston, is to make it a lot more affordable for film studios, advertising agencies or anybody else using Pixar’s RenderMan software to access the compute resources needed to create animated films in 3D. According to Houston, the new Pixar service should significantly lower the cost of making these films, which, in turn, should make the technology more accessible.
Obviously, the Pixar offering is only one instance of a cloudbursting application. The cloudbursting concept could easily be applied across any number of vertical industries, which should result in a number of new and interesting business models in 2012 that could never have been enabled without the cloud.
The Digital Agenda Commissioner for Europe, Neelie Kroes, has announced that the European Commission will give €10m towards a new Cloud Computing Partnership, in an effort to help the continent’s “flagging economies”.
The announcement was made at the World Economic Forum in Davos yesterday, amidst considerable hype surrounding Europe’s current financial perils.
The Partnership will involve a new Europe-wide IT procurement process, much in the vein of the UK’s current G-Cloud initiative, which Kroes hopes will drive a new, secure and cost-effective strategy for cloud adoption on the continent.
Kroes spoke in relation to new EU data protection laws outlined earlier this week by EU Justice Commissioner Viviane Reding, which will give people and organisations the right to ask for personal data to be deleted from servers hosted by third parties, who will have to comply unless they have “legitimate” grounds to retain it.
However, some industry experts are sceptical about the proposed changes, since the US Patriot Act creates a potential loophole if the data is hosted by an American cloud provider, meaning the US government could still have the power to examine any data it deemed incriminating, even on foreign shores.
“In the first phase, the Partnership will come up with common requirements for Cloud procurement. For this it will look at standards; it will look at security; it will look at ensuring competition, not lock-in.
“In the second phase the Partnership will deliver proof of concept solutions for the common requirements.
“In the third phase reference implementations will be built.”
Jeff: 7 Ways to Do Big Data Right Using the Cloud
DaaS Defined: Author – Bob Violino on InfoWorld
An emerging type of service called data as a service, or DaaS, promises to help companies wanting to tap the “big data” floating in the “Internet of things” for competitive advantage and innovation. With DaaS, organizations can gain access to information they need on an on-demand basis, much like they acquire applications via software as a service (SaaS) and storage, servers, and networking components through infrastructure as a service (IaaS). Data is stored by the service provider and accessible to users from the Internet.
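In practice, tapping a DaaS feed usually means calling the provider's web API on demand rather than hosting the data yourself. Here is a purely hypothetical Python sketch; the endpoint, API key and field names are invented for illustration and do not refer to any specific provider.

```python
import requests

# Hypothetical DaaS endpoint and credentials -- illustrative only.
DAAS_ENDPOINT = "https://api.example-daas-provider.com/v1/datasets/retail-sales"
API_KEY = "YOUR_API_KEY"

def fetch_records(region, limit=100):
    """Pull a slice of provider-hosted data on demand, instead of storing it in-house."""
    response = requests.get(
        DAAS_ENDPOINT,
        params={"region": region, "limit": limit},
        headers={"Authorization": "Bearer " + API_KEY},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    for record in fetch_records("EMEA", limit=5):
        print(record)
```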
1. Create a "data mind-set" - a significant IT cultural shift needs to occur
2. But don't neglect infrastructure - if your organization plans to leverage DaaS while persisting data in-house, you need to leverage IaaS capabilities
3. Try before you buy, check references, and insist on SLAs - basically trust, but verify with current customers of DaaS vendors
4. Build a strong governance mechanism - This is where adopters of DaaS need to focus most. Without a clear data governance practice, you are bound to run into data quality issues that will cripple your business intelligence and operational processes
With DaaS, extremely large amounts of data come into organizations from a variety of sources and with varying degrees of criticality and requirements for privacy and security.
Organizations need to have strong governance around standards, guidelines, and policies related to DaaS. "Data governance plays a critical role in data services, ensuring that applications, users, and processes get the right data which they have access to and [that] the data is trusted."
5. Emphasize data quality - Ensure that DaaS data quality is measured, and understand how it is measured, so you can determine the quality of the data you are paying for (see the sketch after this list).
6. Ramp up your analytics skills - Last week I talked at length about this area (refer to last week's podcast), but the basic message here is: put the data to work using analytics tools and resources.
7. Know when to use DaaS and how to measure results
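Tying back to point 5 above, here is a toy Python/pandas sketch of the kind of quality metrics (completeness, duplication, freshness) you might compute on a data set a DaaS vendor delivers. The column names and sample data are assumptions for illustration, not any vendor's schema.

```python
import pandas as pd

def quality_report(df, timestamp_col="updated_at"):
    """Compute simple, vendor-agnostic quality metrics for a delivered data set."""
    report = {
        # Share of non-null cells across the whole frame.
        "completeness": float(df.notna().mean().mean()),
        # Share of fully duplicated rows.
        "duplicate_rate": float(df.duplicated().mean()),
    }
    if timestamp_col in df.columns:
        # Age of the newest record, as a rough freshness indicator.
        newest = pd.to_datetime(df[timestamp_col]).max()
        report["hours_since_last_update"] = (pd.Timestamp.now() - newest).total_seconds() / 3600
    return report

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "spend": [100.0, 250.0, 250.0, None],
        "updated_at": ["2012-01-25", "2012-01-26", "2012-01-26", "2012-01-27"],
    })
    print(quality_report(sample))
```

Numbers like these give you something concrete to hold a vendor's SLA and data-quality claims against.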
Direct download: Cloud_Computing_Podcast_Ep_180.mp3
-- posted at: 9:34 PM
Fri, 20 January 2012
The rise of what I call the retail cloud has been a real mover and shaker the past few years. Today, it's reached critical mass as everything from DVD players to TVs, from car entertainment to alarm clocks, comes with some sort of cloud service to support that device.
For example, Mercedes-Benz announced a new cloud-connected dashboard computer called Mbrace2 that provides access to 3G cellular-connected apps such as Facebook and over-the-air software updates. Ford and Toyota are following up with their own cloud-based systems, providing both driving utilities and entertainment.
Other uses of the cloud are more utilitarian, such as providing storage and processing power for mobile devices, which is old news. But now the same computing models are being used for most entertainment devices in your home, even kitchen appliances that provide "smart grid" features such as the ability to transmit their energy usage and cycle down during peak loads. Pretty much anything that costs more than $100 comes with its own Wi-Fi radio these days.
As you may expect, nobody likes the Stop Online Piracy Act (SOPA) due to the potential for abuse. Indeed, in the last 30 days we've seen the Internet in an uproar. This includes a movement to boycott Go Daddy, which has now changed its tune on SOPA from supporting to not supporting it.
Few cloud providers, however, have chimed in on this controversial issue.
The bottom line is that this legislation sounds like a good idea for those who make a living by providing copyrighted content. However, giving government the power to pull domains and block access could lead to instances where the innocent are caught up in a legal mess they can't afford to fight -- without due process.
Piston Cloud Computing
The creator of the Piston Enterprise Operating System, or PentOS, was lauded for his contributions in helping to create cloud computing itself, through the pioneering NASA Nebula project. There, NASA first demonstrated how to fit a data center cluster in an ordinary shipping container, proving the space program can still produce benefits today.
But last year, Joshua McKenty one-upped himself. He fit an almost entirely self-provisioning cloud operating system for a common rack of servers onto a USB thumb drive. You plug the thumb drive into a PC, edit maybe three lines of a text configuration file, save it, unplug it, plug it into the main server in the rack, and turn it on.
In a very clever video, McKenty demonstrates the "Just Add Water" nature of the configuration process. On Wednesday, Piston Cloud's PentOS - the first commercial implementation of OpenStack, born from NASA Nebula - emerged from public beta into general availability. Alongside came news that Dell has signed on as a provider of Piston Cloud-certified hardware.
PentOS is also certified to run on servers from Silicon Mechanics using Arista Networks' switches.
Add France to the list of European countries pushing a nationalistic cloud computing agenda, one that could have huge repercussions for U.S.-based cloud powers and the nature of cloud computing in general.
France Telecom is partnering with Thales SA, a maker of aerospace systems and industrial electronics, to build a homegrown cloud to offer built-in-France software, according to a Bloomberg news report.
And the verbiage is getting heated.
“It’s the beginning of a fight between two giants,” Jean-Francois Audenard, the cloud security advisor to France Telecom, told Bloomberg. Audenard added:
It’s extremely important to have the governments of Europe take care of this issue because if all the data of enterprises were going to be under the control of the U.S., it’s not really good for the future of the European people.
The impetus or justification comes from the U.S. Patriot Act, which allows U.S. law enforcement to force disclosure of cloud-based data if they perceive a security threat. Last spring Microsoft said that it would have to hand cloud data over to U.S. authorities if asked, even if it resides in the company’s European data centers.
In September, Reinhard Clemens, the CEO of Deutsche Telekom's T-Systems group, said local regulators should enable super-secure clouds to be built in Germany or elsewhere in Europe. He cited pent-up demand among customers who do not want their information accessible to the U.S. government.
In a conference call with press on the subject of Cloud Computing and Data Privacy, U.S. Deputy Assistant Attorney General Bruce Swartz addressed a number of myths about the cloud and how existing laws pertain to it. Swartz noted that there seems to be some controversy about the ability of U.S. law enforcement agencies to access information in the cloud.
"It's important to recognize that the framework for resolving conflict in the law enforcement context was established long ago as the Budapest Cybercrime Conventions which dates back to 2001," Swartz said. "That convention spells out a framework for access by law enforcement to computer data stored within a particular country."
The other myth about U.S. law enforcement in the cloud that Swartz addressed revolves around the impact of the Patriot Act. Enacted in response to the 9/11 terrorist attacks, the law gives U.S. law enforcement broader powers to gather intelligence. Swartz noted that the Patriot Act rationalized existing laws about surveillance, but it did not fundamentally change the rules around stored data. He stressed that the Patriot Act does not affect the cloud for law enforcement purposes.
"Law enforcement in the U.S, the EU, and around the world have worked out a system through treaties, memorandum of understanding and through their own domestic laws to ensure that we can share information and obtain information from each other and do so while protecting privacy," Swartz said.
Jeff: Deeper Dive into Big Data Analytics
Last week I talked about the “golden nugget” in big data, which is the analytics part: the ability to understand and gain new insight and information from both structured and unstructured data sources in near real time. I mentioned Oracle’s release of its big data appliance, which comes bundled with Cloudera’s Apache Hadoop, Oracle NoSQL and an open-source distribution of R. R is used for predictive analytics and statistical modeling, providing organizations with the Big Data Analytics needed to make sense of the huge volumes of data from multiple sources and so gain insight into their organization’s performance and market.
Today, I have a deeper dive into the analytics part of big data, specifically predictive analytics. Predictive analytics is the ability to use huge volumes of historical data to derive actionable intelligence about future patterns. I have an article by Dr. Fern Halper (source: Information Management, Nov/Dec 2011) that discusses five trends in predictive analytics.
1. The user base of predictive analytics tools is expanding from the traditional statistician or quantitative analyst to business analysts and other non-technical business roles. Vendors are responding by making predictive analytics tools easier to use: better UIs, shortcuts to model building, and model result sharing.
2. Operationalizing predictive models: the ability to build a predictive model and incorporate it into business processing engines and platforms in near real time. An example is insurance claims fraud detection; a CMS paper (https://www.cms.gov/MLNMattersArticles/Downloads/SE1133.pdf) describes using predictive modeling on top of medical claims data to flag potentially fraudulent claims for investigation (see the sketch after this list).
3. Supporting unstructured data analysis: this is basically a big data play - the ability to leverage big data technology as the underlying platform and place text analytics tools on top of it, in order to analyze and develop predictive models that use both structured and unstructured data sources.
4. Open Source: the predictive modeling market is embracing the open source community, leveraging approaches that have proven successful in other areas of open source development. An example of such a predictive analytics tool is R, which Oracle has incorporated into its big data appliance technology stack to help users of the platform develop predictive models on top of their big data stores.
5. Big Data: and, most importantly, the emergence of big data together with predictive tools. Fern Halper noted that as part of any big data strategy it is important to take a step back and first ask the business questions the organization is trying to answer, rather than jumping to a strategy that focuses on how to store big data, because through those questions you may find that not all your data sources are relevant to include in your big data strategy.
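To make trend 2 concrete, here is a toy Python sketch of an operationalized model: a logistic regression trained on synthetic claims data that scores each incoming claim for fraud risk. The features, labels and threshold are invented for illustration and are not drawn from the CMS paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Synthetic history: [claim_amount_usd, provider_claims_last_30d, patient_age]
X_train = rng.normal(loc=[2000, 20, 55], scale=[1500, 10, 15], size=(500, 3))
# Invented labels: roughly, larger claims were more often confirmed fraudulent.
y_train = ((X_train[:, 0] > 4000) ^ (rng.random(500) < 0.05)).astype(int)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def score_claim(amount, provider_claims_30d, age, threshold=0.7):
    """Return (fraud_probability, flag_for_review) for one incoming claim."""
    prob = model.predict_proba([[amount, provider_claims_30d, age]])[0, 1]
    return prob, prob >= threshold

# In production, this scoring call would sit inside the claims-processing pipeline,
# routing high-risk claims to investigators rather than paying them automatically.
print(score_claim(9500, 42, 61))
```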
Direct download: Cloud_Computing_Podcast_Ep_179.mp3
-- posted at: 2:56 PM
Fri, 13 January 2012
A recent study of technology job want ads shows a 61 percent increase in demand nationwide for IT professionals with cloud computing experience.
The study, which is based on ads posted by about 2,400 businesses, was performed by Wanted Analytics, a firm that provides business intelligence for the talent marketplace.
Professionals who are looking to get on the cloud hiring bandwagon may need to acquire new cloud technology skills and certifications.
To help fill the demand for qualified IT professionals, some leading IT vendors are now offering education and certification programs. Case in point is Hewlett-Packard, which launched a suite of new cloud computing certifications this past November. The certifications offered by HP are designed to give IT professionals the skills required to design and deploy a cloud environment.
The CB Insights 2011 Venture Capital Report, released Thursday, showed that Internet or web companies took in a healthy $10.5 billion in 1185 deals total last year, although the number of deals tailed off in the fourth quarter.
The report itself doesn’t even mention cloud computing per se, but CB Insights Co-Founder Jonathan Sherry was a good sport and took a shot at parsing out the true cloud computing numbers.
Via email, Sherry said:
Over the course of 2011, we’ve seen a steady uptick of venture investment into cloud-based software and services. In Q4, cloud-based companies comprised 26.5% of internet deal volume and 34.5% of internet investment dollars. Dollar share skewed higher than deal share due to mega deals including Dropbox and Box.net.
For its purposes, CB Insights put Platform-as-a-Service (PaaS), Software-as-a-Service (SaaS) and cloud storage companies like the aforementioned Dropbox and Box.net, along with any true cloud infrastructure players, in this category. It did not include game companies or those that simply run part of their business on Amazon Web Services, for example, in that number.
Some VCs agreed that the categorization remains tricky, especially as any new company now claims cloud credentials whether it has any or not.
There will be new VC numbers out of the National Venture Capital Association and PriceWaterhouseCoopers coming out Jan. 19, so expect still more confusion over what deals are true cloud vs. the product of cloud washing.
Other interesting tidbits from the report:
- Healthcare hit a five quarter high on funding and showed solid deal activity. Sector reverses downward trend and shows pulse to close out 2011.
- Massachusetts took back its #2 spot for both deals and dollars with deals across multiple sectors. Healthcare was particularly strong. Internet investment in the state was the one weak spot in an otherwise strong showing.
- Cali and NY account for 59% of internet VC deals and 66% of funding. Internet investments took over 1/3 of venture dollars in 2011. Massachusetts fell to a five quarter low on internet deal share. Mega-deals in data storage (Dropbox and Box.net) buoy funding.
- Cali still the top spot for VC deals and dollars. ‘Nuff said.
- We’ll bring back our “Sleeping in Seattle” metaphor as Washington continued its multi-quarter dealflow slide and also came down on funding.
Cloud-related startups we’ve spoken about recently on the show. This list is of companies that haven’t been acquired, as there’s an equally long list of those that have been snatched up by the likes of IBM, Oracle, Microsoft, Rackspace, Google, and SAP:
Big Data Analytics is the “Golden Nugget in Big Data”
The traditional data warehouse has served its purpose and continues to serve the goals it was designed for. But with more and more data exploding from multiple sources, and in multiple formats, it is becoming apparent that the traditional data warehouse is being stretched beyond its capabilities. At the same time, organizations increasingly need information from their vast amounts of data to make decisions in near real time. This is what has given rise to Big Data Analytics.
Big Data is a Big Deal (Source: TechTarget.com)
Traditional data warehousing is a large but relatively slow producer of information to business analytics users. It draws from limited data resources and depends on reiterative extract, transform and load (ETL) processes. Customers are now looking for quick access to information that is based on culling nuggets from multiple data sources concurrently. Big Data analytics can be defined, to some extent, in relation to the need to parse large data sets from multiple sources, and to produce information in real time or near real time.
- Big Data Flavors:
  - Scalable Databases (NoSQL)
  - Real-time Streaming
  - Big Data Appliances
  - Big Data Storage
- Big Data Analytics: the ability to understand and gain new insight and information from both structured and unstructured data sources in near real time is the key to encouraging organizations to jump on this new approach to information management. It is not necessarily the storage part of Big Data that management wants to hear about, even though the two are interdependent.
Oracle has released its big data appliance, which comes bundled with Cloudera’s Hadoop, Oracle NoSQL and an open-source distribution of R. R is used for predictive analytics and statistical modeling that provides organizations with the Big Data Analytics needed to make sense of the huge volumes of data from multiple sources, hence gaining insight into their organization’s performance and market.
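To ground the "parse large data sets from multiple sources" idea at the code level, here is a toy Hadoop Streaming style job in Python that counts events per source. The input format (tab-separated source and event fields), the streaming jar location and the HDFS paths in the docstring are assumptions for illustration, not a recipe for any particular appliance.

```python
#!/usr/bin/env python
"""Toy Hadoop Streaming job: count events per data source.

Illustrative invocation (jar location and paths are placeholders):
  hadoop jar hadoop-streaming.jar \
      -mapper  "python count_by_source.py map" \
      -reducer "python count_by_source.py reduce" \
      -input /logs/* -output /reports/by_source

Input lines are assumed to look like: <source>\t<event details>
"""
import sys
from itertools import groupby

def mapper(stream):
    # Emit "<source>\t1" for every event line.
    for line in stream:
        source = line.rstrip("\n").split("\t", 1)[0]
        if source:
            print(source + "\t1")

def reducer(stream):
    # Hadoop hands the reducer mapper output sorted by key, so consecutive
    # identical sources can be grouped and summed.
    keyed = (line.rstrip("\n").split("\t") for line in stream)
    for source, rows in groupby(keyed, key=lambda kv: kv[0]):
        print(source + "\t" + str(sum(int(count) for _, count in rows)))

if __name__ == "__main__":
    (mapper if sys.argv[1:] == ["map"] else reducer)(sys.stdin)
```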
Direct download: Cloud_Computing_Podcast_Ep_178.mp3
-- posted at: 2:47 PM
Mon, 9 January 2012
Research firm IDC is more bullish, estimating that worldwide IT spending will grow 6.9 percent year over year to $1.8 trillion in 2012. A healthy chunk of spending -- as much as 20 percent, IDC says -- will be driven by a handful of technologies that are reshaping the IT industry: smartphones, media tablets, mobile networks, social networking, and big data analytics.
Mobility is introducing significant management and security headaches for IT, while at the same time enabling the business to increase employee productivity and improve customer service. Social networking is spawning a treasure trove of customer data, but also creating an enormous challenge for companies that want to make any sense of all that data.
Fighting for IT talent
As companies try to balance the technical challenges and opportunities, they're also grappling with a shortage of skilled professionals. IT pros with application development, virtualization or cloud computing skills are in short supply, as are those with business analytics expertise.
- AppFog - PaaS that has moved away from its PHP beginnings to the open source Cloud Foundry as its core (it can now support many languages). This is the right move; however, PaaS is going to be a highly contested market this year, so AppFog has its work cut out for it.
- Bromium - virtualization technology as a tool for securing the myriad endpoints (e.g., desktops, mobile phones and tablets) that connect to enterprise networks. I agree, the advent of consumerization means endpoints need security. The offerings aren’t widely known - indicators are that the launch may be in the near future.
- Cloudability - We covered Cloudability a few podcasts ago. This is the centralized control dashboard and monitor for all of your cloud spending. Exercise control over potential cloud sprawl and audit control to identify misuse.
- CloudSigma - IaaS competitor that will be competing with the likes of Amazon Web Services and Rackspace. Its mission seems to be to give customers lots of power and lots of control, comparable to co-location models. Sitting in the impressive SuperNAP data center and offering 10 GbE interconnects as well as solid-state drives, it'll be interesting to see how its customer base develops.
- Kaggle - I’m a fan of CrowdSourcing where you can make it easier to engage the scientific community in order to achieve the equivalent of peer-reviewed optimization for your data analytics. It’s interesting to note that they used to use AWS but have moved to MS Windows Azure.
- Nebula - Nebula ties OpenStack to an optimized hardware platform designed to make building public clouds a plug-and-play experience. While they are pushing a commercial version of the OpenStack platform like others this is one to watch given the purpose-built appliance and pedigree of the founders (ex NASA CTO Chris Kemp among them).
- Parse - Bringing a cloud component to mobile apps, Parse is optimized for that purpose. It will be difficult for Parse to distinguish itself from competitors such as Stackmob, as well as from web-app PaaS offerings such as Heroku and AppFog, but Parse seems to have the right ideas in mind. It has a backend focused on the needs of mobile apps, and a frontend designed for mobile developers that might not have extensive programming chops.
- ScaleXtreme - Everyone needs server-management software, but not everyone needs the big, expensive software offered from traditional software vendors, or even wants to manage software at all. ScaleXtreme gives users a cloud-based service to manage both physical and cloud-based servers, and, it says, has also garnered a lot of interest from cloud providers thinking it might be a good value-added service to their users who want more control.
- SolidFire - Solid state storage and a guaranteed QoS, SolidFire wants nothing less than to revolutionize cloud computing by making it palatable to large enterprises wanting to run mission-critical applications. The company targets cloud providers with SSD-based storage systems that make it possible to store virtual machine images in the cloud and still deliver high performance. Cloud providers utilizing SolidFire gear could find themselves hosting far more relational databases and other applications that presently remain in house.
- Zillabyte - still operating in private beta mode, wants to provide users with both data sets and the algorithms needed to process them. Data sets aren’t uncommon on the web, but they usually don’t come with algorithms and a processing backend. The service will initially focus on web data and text-based algorithms, but there’s plenty of room for growth into new types of data and algorithms as the service matures.
Gartner research director Colleen Graham said that cross-discipline capabilities with master data management have increased the market’s importance, especially as trends like social data, big data and cloud computing ramp up.
“Pressures to optimize costs and efficiencies in a heterogeneous IT environment are driving organizations to turn to MDM as a more efficient way to manage and maintain data across multiple sources,” Graham said in a news release. “In addition, the increasing governance, risk and compliance regulations are forcing organizations to focus on MDM to support these initiatives.”
Under the MDM software market umbrella, customer data and product data are both expected to more than double over the next four years, Gartner states. Research pegs customer data software to hit $644 million by the end of this year and $1 billion by 2015. Product data, including store content information and metadata, is projected to reach $688 million in 2012 and eclipse $1.1 billion in four years. Gartner also foresees more MDM focus in the areas of open source, cloud and SaaS as a response to enterprise demands for lower prices and flexibility.
More than half of all revenue in the MDM market is led by offerings from small and best-of-breed vendors, according to Gartner. That means the overall market surge will likely bring with it a host of smaller competitive mergers in addition to more specialty vendor buyouts by the three largest MDM providers – IBM, Oracle and SAP.
Worldwide master data management (MDM) software revenue will reach $1.5 billion in 2010, a 14 percent increase from 2009, according to Gartner, Inc. MDM is being adopted to support numerous business and IT efforts that deliver revenue, service, agility and risk management improvement, cost reduction and integration simplification.
“MDM is a technology-enabled business discipline in which business and IT organizations work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the organization's official, shared master data assets,” said John Radcliffe, research vice president at Gartner. “Today, most organizations juggle multiple sets of business and data applications across corporate, regional and local systems. At the same time, customers are demanding faster and more complex responses from organizations, leading to an inconsistency that hinders the organization's ability to measure and move within the market. With MDM, CIOs can create a unified view of existing data, leading to greater enterprise agility, simplified integration and, ultimately, improved profitability.”
- From 2009 through 2014, MDM software markets will grow at a Compound Annual Growth Rate (CAGR) of 18 percent, from $1.3 billion to $2.9 billion (a quick sanity check on this figure follows this list).
- MDM is a fast-growing software market that is attracting a lot of attention, and it continued to exhibit double-digit growth, even through the worst of the global recession. The emerging master data domains (e.g., supplier, human resource, asset and location) continue to exhibit even more rapid growth from a smaller base. MDM growth is being driven by niche providers as well as established players. Gartner foresees a larger, more unified MDM software market reaching nearly $3 billion by 2014.
- By 2015, 10 percent of packaged MDM implementations will be delivered as software as a service (SaaS) in the public cloud.
- MDM today is typically implemented on-premises. This is partly because MDM software providers have, so far, not created specific MDM-as-a-service products that are scalable and elastic or multitenanted, and also because there is reluctance in many organizations to place such important, heavily shared data as master data outside the firewall. However, on-premises MDM solutions are increasingly being integrated with SaaS applications, and there are examples of MDM solutions already being implemented in the public cloud.
- Through 2015, 66 percent of organizations that initiate an MDM program will struggle to demonstrate the business value of MDM.
- If IT departments initiate an MDM initiative, they often struggle to get the business on board and to demonstrate the business value of MDM, particularly if there are no business-process-oriented metrics and financial quantifications to define and measure success. MDM needs to align with the business vision and strategy, and will require executive business sponsorship, strong involvement of business stakeholders and change management.
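As promised above, a quick back-of-the-envelope check of the 18 percent CAGR figure (my own arithmetic, not part of the Gartner release):

$$\left(\frac{2.9}{1.3}\right)^{1/5} - 1 \approx 0.174 \approx 18\%, \qquad 1.3 \times 1.18^{5} \approx 2.97,$$

which is consistent with both the $2.9 billion projection and the "nearly $3 billion by 2014" figure quoted above.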
Direct download: Cloud_Computing_Podcast_Ep_177.mp3
-- posted at: 1:20 AM