Humans are not always perfectly rational decision-makers, and more often than not log file analysis is an investment made only after conditions have forced a decision-maker’s hand. Complicating matters, there is a great deal of media attention on the disruptive impact of the cloud computing paradigm and the technologies it enables: commodity computing, software as a service, infrastructure as a service and big data. In this article we discuss service delivery modes, specifically the question of on-premises vs. cloud.
on-premises doesn’t automatically mean ‘secure’
On-premises installations are traditionally thought of as more secure and less prone to attack than cloud-based systems. For this reason, security-conscious organizations strongly prefer on-premises solutions to the exclusion of public cloud. Oftentimes, security conditions will dictate an “air-gapped” private cloud infrastructure; machine data intelligence within the defense and aerospace spheres is a prime example.
Recent security work shows, however, that this traditional thinking may need to be revisited: within the commercial sphere, on-premises data centers are attacked by malware and botnets five times as often as cloud installations, and attacks on cloud data centers increasingly resemble those on on-premises data centers. And, as recent high-profile leaks from closed infrastructural systems in defense show, even high-priority, air-gapped, intensely secured networks can be compromised by something as small as a USB key.
The take-home lesson from these high-profile examples is that no matter how large, complex and expensively secured the technology itself is, an on-premises deployment is only as secure as your organization is.
in big organizations, “where” comes first
The “where” of business intelligence is often the first question that must be answered: Will the solution be an on-premises one, running on hardware belonging to us, or will it be a service provided to us by an outside organization?
In very large organizations a primary and decisive factor in “where” is an organization-wide move towards a private- or hybrid-cloud commodity computing paradigm. Class-leading examples of these types of business needs are military IT initiatives: the SITEC-II cloud infrastructure project for United States Special Operations Command, similar initiatives under way in the Army at large, the Air Force, and even the Navy’s experimental ship-to-ship network system. At a similar level of scale, massive partner networks like those belonging to Microsoft or Red Hat Enterprise Linux are being guided towards adoption of cloud technologies for operations and development.
Despite being named “cloud” or “as-a-service”, it is inaccurate to describe these solutions as entirely “cloud”, by which most people mean “public cloud”. The driving concern in these private cloud deployments is capturing the cost savings of cloud while retaining the inherent security and ease of deployment that come with on-premises technology, important factors that must be balanced by any organization making a “where” decision.
in small to medium-sized businesses, implementation or “how” is a prime deciding factor
Implementation is a more pressing factor for machine data intelligence in small businesses than in large ones. Money and resources aren’t the only operational differences between small and big business; time is money, and the two treat time quite differently as well. Thus, a very common pain point we’ve heard from small businesses starting out in Big Data or the Internet of Things is that they need their machine data intelligence solution yesterday; taking months to a year to install a complex solution is simply not possible. These businesses often invest in machine data intelligence only once the need for it becomes manifestly clear, and that moment is often some kind of time-sensitive business crisis.
Within small-to-medium-sized businesses, initial cost and total cost of ownership (TCO) are also factors in choosing a machine data intelligence platform. While a very large business or government entity might be able to absorb a large initial operational expense, smaller entities weigh initial and recurring costs much more heavily. New businesses chasing profitability are particularly concerned about amortization, hardware costs over time, additional staffing, and limiting liability exposure.
Traditionally, TCO calculations factor in issues like manageability, cost of hardware over time, security, scalability in response to “burst” conditions, and integration with existing or prospective ERP systems. They should also include less tangible factors like opportunity costs and the time costs borne by business people. These costs are quite real as well: faster, easier deployments demonstrably free up development and operations time for activities that can more directly add to the company’s value.
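To make the TCO comparison concrete, here is a deliberately simplified, back-of-the-envelope sketch. All figures, and the flat upfront-plus-recurring cost model itself, are hypothetical illustrations, not benchmarks or pricing for any real product:

```python
# Hypothetical, simplified TCO model: upfront spend plus recurring
# costs over an evaluation period. Real TCO would also account for
# staffing, opportunity cost, burst scaling, integration work, etc.

def total_cost_of_ownership(initial_cost, annual_recurring, years):
    """Upfront cost plus recurring costs accumulated over `years`."""
    return initial_cost + annual_recurring * years

# On-premises: large upfront hardware/licensing, lower recurring spend.
on_prem = total_cost_of_ownership(initial_cost=250_000,
                                  annual_recurring=40_000, years=5)

# Cloud/SaaS: small upfront cost, higher recurring subscription.
cloud = total_cost_of_ownership(initial_cost=10_000,
                                annual_recurring=75_000, years=5)

print(on_prem, cloud)  # 450000 385000
```

Even this toy model shows why the evaluation horizon matters: the cheaper option flips depending on how many years you amortize the upfront expense over.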
for the Internet of Things and Industrial Internet, “why” and “what” are important
We have a particular interest in the Internet of Things and Industrial Internet businesses. These businesses have a characteristic data need: industry-wide, the trend is towards integrating disparate machine systems into “data platforms” capable of generating and consuming information from a diverse set of sources in real time. Here, applying the time-ordered, “append-only” paradigm of log data provides powerful benefits in ordering and analyzing the unique format and high volume of machine-to-machine communication.
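The time-ordered, append-only paradigm described above can be sketched in a few lines. This is a minimal illustration, not any particular product’s implementation; the entry fields and sensor names are invented for the example:

```python
# Minimal sketch of the append-only log paradigm: each source emits
# timestamped, immutable entries, and a platform merges many such
# streams into one globally time-ordered sequence.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class LogEntry:
    timestamp: float                      # entries compare by time only
    source: str = field(compare=False)
    payload: str = field(compare=False)

def merge_streams(*streams):
    """Merge time-ordered streams into one time-ordered sequence.

    heapq.merge assumes each input is already sorted, which the
    append-only discipline guarantees per source.
    """
    return list(heapq.merge(*streams))

sensor_a = [LogEntry(1.0, "sensor-a", "temp=20"),
            LogEntry(3.0, "sensor-a", "temp=21")]
sensor_b = [LogEntry(2.0, "sensor-b", "rpm=900")]

merged = merge_streams(sensor_a, sensor_b)
print([e.timestamp for e in merged])  # [1.0, 2.0, 3.0]
```

Because sources only ever append, per-source order is free, and a cheap k-way merge recovers a global timeline across machines, which is the property that makes this paradigm attractive for high-volume machine-to-machine data.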
When you choose a solution, it makes sense to answer the fundamental questions we’ve touched on above: why, what, and how should determine where, and thus who.
As we further explore the advantages and comparative strengths and weaknesses of cloud and on-premises deployments, we welcome discussion and the chance to talk further about the whys, whats and hows of your business and its data analysis needs. Contact us to learn more and sign up for our beta.