Cost savings… elasticity… scalability… load “bursting”… storage on demand… These are the advertised benefits of cloud computing, and they certainly help make a solid business case for using either third-party services or a virtualized data center.

But after the agreements are signed, systems and processes are set up, and users are retrained, something unexpected happens. The initial use cases are realized, but then additional benefits begin to emerge — icing on the cake, except that these unforeseen benefits often provide far more value to the business than initially planned.

Over the past couple of years, I have spoken with many CIOs and executives who not only talk about the unplanned challenges but also cite pleasant surprises. Here are some examples of such unexpected benefits that emerge as cloud projects roll along:

Read more …

You may not think your business has any information stored in the cloud, but it does. (Quick: Who handles your email spam filter?)

And whatever business functions you already use the cloud for are only poised to increase.

“Ultimately, I don’t think you have a whole lot of choice,” said Marty Metz, chief information officer at Pillsbury Winthrop Shaw Pittman, speaking at a Nashville Business Journal panel on cloud computing Tuesday morning.

Building on the metaphor of in-house data storage as money kept under your bed or in a sock drawer — and the cloud as money kept in a bank — Metz and four other panelists focused on the costs versus benefits of cloud computing, as well as its risks and its myths.

Read more …

The cloud has created a paradigm shift that’s every bit as important as the industrial revolution for businesses and consumers, says Gary Turner, managing director, Xero UK. For firms that haven’t already made the most of it, the opportunity to re-imagine services and create innovative new ways to add real value for clients is beckoning.

The year 2014 has heralded a real turning point in the maturity of digital. Almost three billion people – 40% of the world’s population – are using the Internet according to the latest figures from the United Nations. Facebook has 1.35 billion active monthly users, which is incredible when you consider the site has only been around for ten years.

Where can digital take you?

With the web becoming a way of life around the world, the cloud as a mechanism is no longer the primary commercial focus. Rather, what’s important is the benefit that digital can bring. In this respect, it’s not about the engine; it’s about all the new places that engine can take you.

The emergence of cloud-hosted digital innovation is a shift that’s every bit as important as the industrial revolution two centuries ago – it’s honestly that mould-breaking. History shows that the invention of steam power radically rewrote the entire fabric of production, economics and employment. In the longer term, it even ended up rewriting the fabric of Western society itself.

Read more …

Four years ago, the National Security Agency realized it had a data problem.

In the aftermath of the Sept. 11, 2001, terrorist attacks, armed with new authorities that allowed the intelligence agency to collect an assortment of global communications and Internet metadata, NSA ramped up its operations. These expanded surveillance efforts—many of which were disclosed publicly in leaks by former NSA contractor Edward Snowden—produced a massive volume of data that began to flood the agency’s disparate databases.

By the mid-2000s, NSA was adding racks of servers to quell the voracious appetite of multiple programs, a costly effort that further burdened intelligence analysts, who had to access dozens of databases and perform individual queries against each of them. Analysts could spend 30 minutes or longer each morning just opening the databases they needed to query, a senior NSA official told Government Executive.

“What we were doing was just adding servers, and that doesn’t scale at a certain point,” the senior official says. “We had a volume issue—more than a volume issue, our analysts are swamped going in and out of repositories.”

The Best Bet

NSA officials picked up on Google research in 2007 that ultimately paved the way for the intelligence agency’s formal adoption of cloud computing in 2010—part of a broad effort within the intelligence community to more effectively share data and services. In a similar move, the CIA last year signed a contract with Amazon Web Services for a cloud to be used by all 17 intelligence agencies.

Read more …

With FCC officials indicating that they would establish rules regarding location accuracy for indoor 911 calls from cell phones, the four nationwide wireless carriers on Friday announced a voluntary agreement with two key public-safety organizations on the topic, although other first-responder groups expressed objections to the deal.

In the agreement signed with the National Emergency Number Association (NENA) and the Association of Public-Safety Communications Officials (APCO), the four nationwide U.S. cellular carriers—AT&T, Sprint, T-Mobile and Verizon—promise to provide public safety with a “dispatchable location” for many 911 calls and meet benchmarks that begin within two years.

“The proposed solution harnesses the availability of Wi-Fi® and Bluetooth® technologies that are already deployed and expected to expand significantly in the near term,” according to a joint press release issued Friday afternoon from the parties to the agreement.

Features of the agreement include timelines to verify technologies and vendor performance in a testbed environment; to develop, demonstrate and implement standards for database and handset capabilities; and to improve existing location-based technologies both indoors and outdoors.

Read more …

Indiana Chief Information Officer Paul Baltzell recently spoke with StateScoopTV to discuss the Hoosier State’s efforts to harness big data — and to highlight his other top projects for the coming months.

Indiana recently completed its Management and Performance Hub Technology Center, which aims to bring greater effectiveness, efficiency and transparency to state government by using real-time data technology.

Read more …

Spending on public sector cloud services is expected to grow from $56.6 billion in 2014 to more than $127 billion in 2018, according to figures from analyst firm IDC.

The increase represents a five-year compound annual growth rate (CAGR) of 22.8 per cent, over six times the growth rate for the overall IT market.

The firm said public sector IT cloud services would account for over half of worldwide software, server and storage spending growth in 2018.

“Over the next four to five years, IDC expects the community of developers to triple and to create a ten-fold increase in the number of new cloud-based solutions,” said Frank Gens, senior vice president and chief analyst at IDC.

“Many of these solutions will become more strategic than traditional IT has ever been. At the same time, there will be unprecedented competition and consolidation among the leading cloud providers.”

Read more …

Spending on public cloud computing services is forecast to grow at six times the rate of the overall information technology market over the next five years, IDC says. The research firm predicts that public IT cloud spending will hit $127.5 billion in 2018, up from $56.6 billion this year.

That represents a five-year compound annual growth rate of 22.8%. In 2018, public IT cloud services will account for more than half of worldwide software, server, and storage spending growth, IDC said in a report Monday.
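
As a quick arithmetic check on those figures (my own sketch, not something from IDC’s report — in particular, the assumption that the “five-year” window runs 2013 through 2018 is mine), the numbers hang together: a 22.8% CAGR ending at $127.5 billion implies a 2013 baseline of roughly $45.7 billion, and compounding the published 2014 figure forward four years lands close to the projected endpoint.

# Sanity check of the IDC forecast figures quoted above (Python).
# Assumption (mine): the "five-year" CAGR window runs 2013-2018,
# so the 2014 figure sits one year into it.
cagr = 0.228        # IDC's stated five-year compound annual growth rate
spend_2014 = 56.6   # reported 2014 public IT cloud spending, $B
spend_2018 = 127.5  # projected 2018 spending, $B

# A 22.8% CAGR ending at $127.5B implies a 2013 baseline of ~$45.7B...
implied_2013 = spend_2018 / (1 + cagr) ** 5
print(f"Implied 2013 baseline: ${implied_2013:.1f}B")    # ~$45.7B

# ...and the 2014 figure compounded four more years lands near $127.5B.
forward_2018 = spend_2014 * (1 + cagr) ** 4
print(f"2014 compounded to 2018: ${forward_2018:.1f}B")  # ~$128.7B

The small overshoot (about $128.7 billion versus $127.5 billion) is consistent with the published rate being rounded to one decimal place.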

The cloud services market is entering an “innovation stage” that will produce an explosion of new services and value creation on top of the Internet cloud, IDC said.

Read more …


The Internet of Things (IoT) is starting to make waves in law enforcement. From connected guns that remember exactly when and how they were fired to wearable smart devices designed for police dogs, the IoT is becoming a go-to solution not only to improve law enforcement officers’ capabilities, but also to increase accountability and public safety.

Here are some examples of IoT products and services that are just beginning to have an impact on law enforcement.

Read more …

Susan L. Cutter is a Carolina Distinguished Professor of Geography at the University of South Carolina, where she directs the Hazards and Vulnerability Research Institute. Her primary research interests are in disaster vulnerability/resilience science — what makes people and the places where they live vulnerable to extreme events, and how vulnerability and resilience are measured, monitored and assessed.

Cutter is a GIS hazard mapping guru who supports emergency management functions. I posed a series of questions about mapping and asked her to respond in writing. In her responses, Cutter reminds us to ask the “why of the where” question when looking at maps.

What has been the evolution of hazard mapping in the United States, and how does that compare with what is being done in other countries?

Hazard mapping has a long history here in the U.S. going back to the 1960s with the work of Gilbert F. White and his insistence that we map not only where the hazards are, but where people live and work relative to the risks, what he called the human occupance of hazardous areas. Hazard mapping has evolved hand-in-hand with the understanding that we can never truly control nature. Mapping has shifted from a focus on the event itself (modeling physical processes) toward a focus on understanding interactions between people and the environment. The U.S., because of the large diversity in possible hazard threats to the nation, has become a leader in hazard mapping and the integration of new tools and technologies (such as GIS, remote sensing, GPS) into the emergency management cycle.

Read more …
