The rise of what I call the retail cloud has been a real mover and shaker over the past few years. Today, it's reached critical mass: everything from DVD players to TVs, from car entertainment systems to alarm clocks, comes with some sort of cloud service to support the device.
For example, Mercedes-Benz announced a new cloud-connected dashboard computer called Mbrace2 that provides access to 3G cellular-connected apps such as Facebook, along with over-the-air software updates. Ford and Toyota are following suit with their own cloud-based systems, providing both driving utilities and entertainment.
Other uses of the cloud are more utilitarian, such as providing storage and processing power for mobile devices, which is old news. But now the same computing models are being used for most entertainment devices in your home, even kitchen appliances that provide "smart grid" features such as the ability to transmit their energy usage and cycle down during peak loads. Pretty much anything that costs more than $100 comes with its own Wi-Fi radio these days.
As you may expect, few like the Stop Online Piracy Act (SOPA), due to the potential for abuse. Indeed, in the last 30 days we've seen the Internet in an uproar, including a movement to boycott Go Daddy, which has since reversed its position on SOPA from support to opposition.
So far, however, few cloud providers have chimed in on this controversial issue.
The bottom line is that this legislation sounds like a good idea for those who make a living by providing copyrighted content. However, giving government the power to pull domains and block access could lead to instances where the innocent are caught up in a legal mess they can't afford to fight -- without due process.
Piston Cloud Computing
The creator of the Piston Enterprise Operating System, or PentOS, was lauded for his contributions in helping to create cloud computing itself, through the pioneering NASA Nebula project. There, NASA first demonstrated how to fit a data center cluster in an ordinary shipping container, proving the space program can still produce benefits today.
But last year, Joshua McKenty one-upped himself: he fit an almost entirely self-provisioning cloud operating system for a common rack of servers onto a USB thumb drive. You plug the thumb drive into a PC, edit maybe three lines of a text configuration file, save it, unplug it, plug it into the main server in the rack, and turn it on.
In a very clever video, McKenty demonstrates the "Just Add Water" nature of the configuration process. On Wednesday, Piston Cloud's PentOS, the first commercial implementation of OpenStack, born from NASA Nebula, emerged from public beta into general availability. Alongside came news that Dell has signed on as a provider of Piston Cloud-certified hardware.
PentOS is also certified to run on servers from Silicon Mechanics using Arista Networks' switches.
Add France to the list of European countries pushing a nationalistic cloud computing agenda, one that could have huge repercussions for U.S.-based cloud powers and the nature of cloud computing in general.
France Telecom is partnering with Thales SA, a maker of aerospace systems and industrial electronics, to build a homegrown cloud to offer built-in-France software, according to a Bloomberg news report.
And the rhetoric is getting heated.
“It’s the beginning of a fight between two giants,” Jean-Francois Audenard, the cloud security advisor to France Telecom, told Bloomberg. Audenard added:
It’s extremely important to have the governments of Europe take care of this issue because if all the data of enterprises were going to be under the control of the U.S., it’s not really good for the future of the European people.
The impetus or justification comes from the U.S. Patriot Act, which allows U.S. law enforcement to force disclosure of cloud-based data if they perceive a security threat. Last spring Microsoft said that it would have to hand cloud data over to U.S. authorities if asked, even if it resides in the company’s European data centers.
In September, Reinhard Clemens, the CEO of Deutsche Telekom’s T-systems group, said local regulators should enable super secure clouds to be built in Germany or elsewhere in Europe. He cited pent-up demand among customers who do not want their information accessible to the U.S. government.
In a conference call with press on the subject of Cloud Computing and Data Privacy, U.S. Deputy Assistant Attorney General Bruce Swartz addressed a number of myths about the cloud and how existing laws pertain to it. Swartz noted that there seems to be some controversy about the ability of U.S. law enforcement agencies to access information in the cloud.
"It's important to recognize that the framework for resolving conflict in the law enforcement context was established long ago by the Budapest Cybercrime Convention, which dates back to 2001," Swartz said. "That convention spells out a framework for access by law enforcement to computer data stored within a particular country."
The other myth about U.S. law enforcement in the cloud that Swartz addressed revolves around the impact of the Patriot Act. Enacted in response to the 9/11 terrorist attacks, the law gives U.S. law enforcement broader powers to gather intelligence. Swartz noted that the Patriot Act rationalized existing laws about surveillance, but it didn't fundamentally change the rules governing stored data. He stressed that the Patriot Act does not affect the cloud for law enforcement purposes.
"Law enforcement in the U.S., the EU, and around the world have worked out a system through treaties, memoranda of understanding, and their own domestic laws to ensure that we can share information and obtain information from each other, and do so while protecting privacy," Swartz said.
Jeff: A Deeper Dive into Big Data Analytics
Last week I talked about the "golden nugget" in big data: the analytics part, the ability to understand and gain new insight from both structured and unstructured data sources in near real time. I mentioned Oracle's release of its big data appliance, which comes bundled with Cloudera's Apache Hadoop, Oracle NoSQL, and an open-source distribution of the R software. R is used for predictive analytics and statistical modeling, giving organizations the big data analytics needed to make sense of huge volumes of data from multiple sources and gain insight into their performance and market.
Today I have a deeper dive into the analytics part of big data, specifically predictive analytics: the ability to use huge volumes of historical data to derive actionable intelligence about future patterns. An article by Dr. Fern Halper (source: Information Management, Nov/Dec 2011) discusses five trends in predictive analytics.
1. Expanding user base: The users of predictive analytics tools are expanding from the traditional statistician or quantitative analyst to business analysts and other non-technical business roles. Vendors are responding by making the tools easier to use: simpler UIs, shortcuts to model building, and model result sharing.
2. Operationalizing predictive models: The ability to build a predictive model and incorporate it into business processing engines and platforms in near real time. Example: insurance claims fraud detection. A paper by CMS (https://www.cms.gov/MLNMattersArticles/Downloads/SE1133.pdf) describes using predictive modeling on top of medical claims data to flag potentially fraudulent claims for investigation.
3. Supporting unstructured data analysis: This is basically a big data play; the ability to leverage big data technology as the underlying platform and place text analytics tools on top of it, to analyze and develop predictive models that use both structured and unstructured data sources.
4. Open Source: The predictive modeling market is embracing the open source community, leveraging approaches that have proven successful in other areas of open source development. One example of such a predictive analytics tool is R, which Oracle has incorporated into its big data appliance technology stack to help users of the platform develop predictive models on top of their big data stores.
5. Big Data: And most importantly, the emergence of big data and predictive tools together. Fern Halper noted that as part of any big data strategy, it is important to take a step back and first ask which business questions the organization is trying to answer, rather than jumping straight to a strategy focused on how to store big data. Through those questions you may find that not all of your data sources are relevant to your big data strategy.
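To make the operationalization idea in trend 2 concrete, here is a minimal sketch in plain Python of the general technique, training a tiny logistic-regression model on historical claims and flagging incoming claims whose predicted fraud probability crosses a cutoff. The feature set, data, and threshold are all hypothetical illustrations, not the CMS system or any vendor's actual model.

```python
import math

def train_logistic(rows, labels, lr=0.1, epochs=2000):
    """Fit a tiny logistic-regression model with batch gradient descent.

    rows   -- historical claims as feature vectors
    labels -- 1 = known fraudulent, 0 = legitimate
    """
    n = len(rows[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * n, 0.0
        for x, y in zip(rows, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            err = 1.0 / (1.0 + math.exp(-z)) - y  # prediction minus label
            for i in range(n):
                gw[i] += err * x[i]
            gb += err
        for i in range(n):
            w[i] -= lr * gw[i] / len(rows)
        b -= lr * gb / len(rows)
    return w, b

def fraud_probability(model, x):
    """Score one claim: probability it resembles past fraud."""
    w, b = model
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical history: [claim amount in $1000s, claims filed this year]
history = [[1.0, 1], [2.0, 1], [1.5, 2], [9.0, 8], [8.0, 7], [10.0, 9]]
labels = [0, 0, 0, 1, 1, 1]
model = train_logistic(history, labels)

# "Operationalized": score incoming claims and flag those above the cutoff.
incoming = [[1.2, 1], [9.5, 8]]
flagged = [c for c in incoming if fraud_probability(model, c) > 0.5]
print(flagged)  # the high-amount, high-frequency claim is flagged
```

The same shape applies at scale: the model is trained offline on historical data, then embedded in the claims-processing pipeline so each new claim is scored as it arrives.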