Smarty-pants bots, artificial intelligence engines and machine-learning servers are taking over. They are photographing and logging checks, researching credit scores, balancing books, monitoring and grading application performance and texting your mother to tell her that you’ve stayed up too late the past few nights. Well, maybe not that last one.
A survey of 1,106 business and technology executives, published by the IBM Institute for Business Value, finds that 85 percent of companies are already operating in multicloud environments. Moreover, 98 percent forecast that they will be using multicloud within three years. These findings should surprise nobody who reads this blog.
However, the survey finds that only 39 percent of the respondents have implemented devops processes and tool chains. From a devops perspective, 51 percent of respondents use multicloud to cultivate a flexible infrastructure that supports agile application deployments.
At the Linux Foundation's inaugural Open FinTech Forum in New York City last week, attendees got a chance to discuss the latest state of open-source adoption and the extent that open-source strategies are changing financial service businesses.
Virtustream Enterprise Cloud is built to run complex, mission-critical enterprise applications such as SAP, with a full suite of professional and managed services. It offers high availability and performance.
The global open source code repository also released new security tools with the GitHub Security Advisory API, new ways to learn across teams with GitHub Learning Lab for organizations, and other items. Oh yes, it also released the annual "State of the Octoverse" report.
A poll of 250 IT decision makers across North America, conducted by managed services provider Softchoice, found that preparation for cloud initiatives is on track. 83 percent of those polled said they had assessed existing applications to determine whether they were ready for the cloud, and 82 percent had modernized their data centers in preparation for cloud. Moreover, 72 percent had internally communicated the business impact of a cloud strategy.
But there were some surprises in what companies discovered once they moved to the cloud:
Cloud services like Azure offer a lot of security features straight out of the box, especially if you’re using their platform services. But virtual infrastructures are much like physical infrastructures, connecting virtual machines with software-defined virtual networks. Thus, they need the same security and network management tools as your own data center and your own application infrastructures.
eWEEK DATA POINTS: The cloud is used by some of the most security-minded organizations in the world. So why can’t we use it to streamline and strengthen the voting process? What would we need in a cloud-based voting system in order to trust it with something so important?
One of the last computing chores to be sucked into the cloud is data analysis. Perhaps it’s because scientists are naturally good at programming and so they enjoy having a machine on their desks. Or maybe it’s because the lab equipment is hooked up directly to the computer to record the data. Or perhaps it’s because the data sets can be so large that it’s time-consuming to move them.
Microsoft is perhaps the most impressive company on the planet right now. While it doesn’t (currently) dominate markets like it used to, Microsoft has managed something dramatically more difficult, something that portends future success as a platform behemoth: profound cultural change.
Not to name names, but I’ve been reading in several publications that one of the main reasons to go multicloud is to avoid vendor lock-in. While I can see the logic behind this assumption—that having more cloud providers means you can be more independent—the reality is much different.
A book published in 1981, called Nailing Jelly to a Tree, describes software as “nebulous and difficult to get a firm grip on.” That was true in 1981, and it is no less true nearly four decades later. Software, whether it is an application you bought or one that you built yourself, remains hard to deploy, hard to manage, and hard to run.
Docker containers provide a way to get a grip on software. You can use Docker to wrap up an application in such a way that its deployment and runtime issues—how to expose it on a network, how to manage its use of storage and memory and I/O, how to control access permissions—are handled outside of the application itself, and in a way that is consistent across all “containerized” apps. You can run your Docker container on any OS-compatible host (Linux or Windows) that has the Docker runtime installed.
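A minimal container definition makes the point concrete: network exposure and the run command live in the container definition, outside the application itself. This is an illustrative sketch only—the app.py file and port 8080 are hypothetical stand-ins, not anything from a real project:

```
# Illustrative Dockerfile: app.py and port 8080 are hypothetical stand-ins
FROM python:3.11-slim      # base OS and runtime, chosen outside the app
WORKDIR /app
COPY app.py .              # the application itself
EXPOSE 8080                # network exposure declared here, not in code
CMD ["python", "app.py"]   # how to run it, also declared here
```

The same definition builds and runs unchanged on any compatible host with the Docker runtime installed—which is exactly the consistency across “containerized” apps described above.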
I would have titled this post “How to be a rainmaker in the cloud,” except that the term rainmaker usually refers to the selling process—which is already succeeding, and that very success is a key reason so many cloud initiatives are all wet. If nothing else, the popularity of cloud services has made the use of metaphors in technical articles much easier!
In this post, I write about some of the slipperier aspects of cloud services, how to reap the most benefits, and ways to identify potential pitfalls before a sinking feeling sets in.
AI, big data, and the cloud
A great expression going around about AI and big data is that they are “like teenage sex: Everyone talks about it; nobody really knows how to do it; everyone thinks everyone else is doing it; so everyone claims they are doing it.” To that I would add: “and most of those who are doing it are not having nearly as much fun as they could be.” The same can be said for the cloud, even though it has been around a lot longer and is relatively more mature. And many people actually already have it, though they might not know it, so maybe it is more like insanity.
Adobe is delivering on the promise it made in May when it announced plans to acquire Magento Commerce for $1.68 billion, a deal it said would lead to the creation of a single platform for both B2B and B2C customers globally.
The biggest barrier to effective IoT implementation is lack of internal expertise and skills, according to 31 percent of respondents. Other barriers include the inability to manage and process large volumes of data (29 percent), integration issues (28 percent) and too many legacy systems (28 percent).
There are lots of big cloud shows coming up, and the core themes will be containers, devops integration, and more serverless computing services, such as databases, middleware, and dev tools.
Why the focus on serverless computing? It’s a helpful concept, where you don’t have to think about the number of resources you need to attach to a public cloud service, such as storage and compute. You just use the service, and the back-end server instances are managed for you: “magically” provisioned, used, and deprovisioned.
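That division of labor can be sketched in a few lines. This is a toy illustration, not any real provider’s API: `handler` is the only code you would deploy, and `run_function` stands in for the platform’s invisible provisioning work.

```python
# Toy sketch of the serverless model: you write only the handler; the
# "platform" (run_function, invented here) provisions and tears down
# compute around each invocation.

def handler(event):
    """Your business logic -- the only code you deploy."""
    return {"status": 200, "body": f"hello, {event['name']}"}

def run_function(fn, event):
    # The platform's job, invisible to you:
    instance = {"cpu": "256m", "memory": "128Mi"}  # "magically" provisioned
    try:
        return fn(event)                           # invoked on demand
    finally:
        instance.clear()                           # deprovisioned after use

print(run_function(handler, {"name": "cloud"}))
```

The appeal is that nothing about server counts, instance sizes, or scaling ever appears in the handler itself.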
The serverless cloud computing concept is now white-hot in the cloud computing world, and the cloud providers are looking to cash in. Who can blame them? At the same time, you can take things to a silly level. I suspect there’ll be a few serverless concepts that jump the shark the day they are announced.
Building distributed systems is hard. When you’re working with applications that span a planet, the speed of light is a brake on what you want to do, complicating data replication among servers and services. Someone buys a widget in Hong Kong at almost the same time as someone in Paris, but there’s only one in stock. How do you know who to bill, and who to tell the purchase failed? Whose purchase ends up being the one recorded in your line-of-business tools?
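One common answer is to route competing requests through a single authoritative ordering step. The sketch below is a deliberately simplified stand-in (no real replication involved): orders are ranked by timestamp, with the region name as a deterministic tie-breaker, so every replica that replays the log agrees on who got the widget.

```python
# Toy conflict resolution: two buyers race for the last widget, and a
# single authoritative ordering (timestamp, then region) decides who wins.

def resolve_purchase(orders, stock=1):
    """Return (billed, rejected) given competing orders for limited stock."""
    ordered = sorted(orders, key=lambda o: (o["ts"], o["region"]))
    return ordered[:stock], ordered[stock:]

orders = [
    {"region": "Hong Kong", "ts": 1_700_000_000.001},
    {"region": "Paris",     "ts": 1_700_000_000.002},
]
billed, rejected = resolve_purchase(orders)
print(billed[0]["region"])   # Hong Kong wins by a millisecond
```

Real systems hide enormous complexity behind that `sorted` call—clock skew alone makes the timestamps untrustworthy—which is exactly why building these systems is hard.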
A survey of 300 IT professionals by Fugue, a cloud infrastructure security provider, reveals that most enterprises are vulnerable to security events caused by cloud misconfiguration, including data breaches and system downtime events.
From the report:
Nine in ten have real concerns about security risks due to misconfiguration, and less than a third continuously monitor for them.
Teams report 50 or more misconfigurations each day, yet half of the teams review alerts and remediate issues only on a daily—or longer—timeframe, leading to dangerously long periods of infrastructure vulnerability.
Of course, this report (like any vendor-sponsored report) is self-serving. But the message reflects something that I’m seeing a lot today in the real world—and it’s scaring the hell out of me.
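Continuous monitoring for misconfiguration is, at its core, a loop that checks resource state against a rule set. The sketch below is a hedged illustration—the rules and the resource shapes are invented stand-ins for real cloud API responses, not any vendor’s tooling:

```python
# Toy misconfiguration scanner: dicts stand in for cloud API responses,
# and each rule flags one risky setting.

RULES = {
    "public_bucket": lambda r: r.get("type") == "bucket" and r.get("public"),
    "open_ssh":      lambda r: 22 in r.get("open_ports", []),
    "no_encryption": lambda r: r.get("encrypted") is False,
}

def scan(resources):
    """Return (resource id, rule name) pairs for every violation found."""
    findings = []
    for res in resources:
        for name, check in RULES.items():
            if check(res):
                findings.append((res["id"], name))
    return findings

resources = [
    {"id": "bkt-1", "type": "bucket", "public": True, "encrypted": False},
    {"id": "vm-1",  "type": "vm", "open_ports": [22, 443], "encrypted": True},
]
print(scan(resources))
```

Run continuously rather than on a daily review cycle, a check like this is what closes the long vulnerability windows the report describes.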
A flash storage pioneer that has seen hard times has fine-tuned its strategy and is going after only what it calls the “extreme-performance” storage market. It also has a new hot-shot flash array to talk about.
Cloud computing has two meanings. The most common refers to running workloads remotely over the internet in a commercial provider’s data center, also known as the “public cloud” model. Popular public cloud offerings—such as Amazon Web Services (AWS), Salesforce’s CRM system, and Microsoft Azure—all exemplify this familiar notion of cloud computing. Today, most businesses take a multicloud approach, which simply means they use more than one public cloud service.
The second meaning of cloud computing describes how it works: a virtualized pool of resources, from raw compute power to application functionality, available on demand. When customers procure cloud services, the provider fulfills those requests using advanced automation rather than manual provisioning. The key advantage is agility: the ability to apply abstracted compute, storage, and network resources to workloads as needed and tap into an abundance of prebuilt services.
Also, one in three enterprises uses managed or native Kubernetes orchestration solutions, and 28 percent of enterprises use Docker containers in AWS.
None of this data should surprise you given the explosion of the market, but it is interesting to confirm that enterprises are now quickly moving up the cloud stack. The focus is shifting away from infrastructure services, such as basic storage and compute, to the services the “cool kids” are using: multicloud management, serverless computing, and containers.
Life changed for programmers and operations teams when the cloud arrived. Instead of waiting weeks, months, or sometimes more than a year for new hardware to be purchased and provisioned, the cloud makes it possible to get a new idea up and running in seconds with just a click or three.
For most companies, multicloud and hybrid cloud environments aren’t a choice. They’re just what happens as those companies evolve. So while 451 Research projects that 69 percent of organizations expect to run a multicloud environment by 2019, the reality is that 100 percent are already there. That’s because any company that has set up in the cloud is almost certainly already running in more than one. The reason? Developers.
You are setting up a quick report around sales for the year, and it needs to use three separate databases. One is an object database running on Amazon Web Services. The second and third are relational databases running on Microsoft Azure.
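The interesting part of that report is the combine step. The sketch below is a local stand-in, not real AWS or Azure code: a plain dict plays the object database, and two in-memory sqlite3 databases play the relational ones, so the pattern of merging per-source results runs anywhere.

```python
# Combining sales data from three sources for one report. Stand-ins:
# a dict for the object database, two in-memory sqlite3 DBs for the
# relational databases.
import sqlite3

object_db = {"W1": {"product": "widget"}, "W2": {"product": "gadget"}}

def make_rel_db(rows):
    """Each 'relational database' holds one region's sales rows."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sales (sku TEXT, amount REAL)")
    db.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    return db

emea = make_rel_db([("W1", 100.0), ("W2", 50.0)])
apac = make_rel_db([("W1", 25.0)])

# Total per SKU across both relational sources, then look up product
# names in the object store.
totals = {}
for db in (emea, apac):
    query = "SELECT sku, SUM(amount) FROM sales GROUP BY sku ORDER BY sku"
    for sku, amount in db.execute(query):
        totals[sku] = totals.get(sku, 0.0) + amount

report = {object_db[sku]["product"]: total for sku, total in totals.items()}
print(report)   # {'widget': 125.0, 'gadget': 50.0}
```

Against real cloud databases the queries would go over the network through each provider’s driver, but the shape of the merge logic stays the same.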
The latest version of the container orchestration system Kubernetes, 1.12, brings to GA the Kubelet TLS Bootstrap, a feature that automates the provisioning of TLS client certificates for Kubelets. Kubernetes 1.12 also adds support for container cluster autoscaling on Microsoft Azure’s virtual machine scale sets.
The third major release of the open-source Kubernetes container orchestration system in 2018 is now out, providing users with a stable release of a key security feature that has been in development for two years, while previewing a new sandboxing isolation capability.
Choosing the right cloud computing architecture depends on your business and technology service requirements. This excerpt from Architecting Cloud Computing Solutions explains the different cloud models including baseline cloud architectures, complex architectures, and hybrid clouds.
Remember Snort? Or Asterisk? Or Jaspersoft or Zimbra? Heck, you might still be using them. All of these open source champions—InfoWorld Best of Open Source Software Award winners 10 years ago—are still going strong. And why not? They’re still perfectly useful.
Ten years ago these tools were among the best answers to pressing needs in the enterprise network—for intrusion detection, call management, reporting, and collaboration. But looking back on them now, you can’t help but think, “Wow. Software was so much simpler then.”
But even as we grapple with the likes of microservice architecture, distributed data processing frameworks, deep neural networks, and “dapps,” we remain steadfast in our commitment to bring you—this year and every year—the best that open source has to offer. Welcome to InfoWorld’s 2018 Best of Open Source Software Awards!