Big data firm Hortonworks revamps pricing to cover both on-premises and cloud
Hortonworks Inc. is kicking off its DataWorks Summit in San Jose, California, this week with the announcement of a new software support subscription that provides unified pricing to organizations as they move between on-premises and Amazon Web Services Inc.-based cloud deployments.
Hortonworks Flex provides a single Hortonworks Data Platform subscription that is transferable between cloud and on-premises deployments. The subscription encompasses both software support and advisory services covering Apache Spark, data preparation tasks known as extract/transform/load, and analytics workloads in the Hortonworks Data Cloud for AWS.
Hortonworks subscriptions were previously sold on a fixed-capacity basis, even if capacity varied due to seasonal or project factors. “If customers wanted to add four nodes, they’d have to add that to the subscription and they couldn’t give them back,” Hortonworks Chief Technology Officer Scott Gnau said in an interview. “This is largely about flexibility and the ability to deploy across different domains and platforms.”
In making the announcement, the company cited Forrester Research Inc.’s 2016 Global Business Technographics Data and Analytics Survey, which said moving into the public cloud is the No. 1 priority for global data and analytics technology decision-makers. About a quarter of Hortonworks’ customers are using the company’s software in the public cloud today.
Hortonworks isn’t publishing the price list and Gnau wouldn’t speculate about whether the change will save customers money. Hortonworks will continue to sell fixed-capacity subscriptions. “In some cases, customers may have both schemes,” he said. “If their environment is predictable most of the time, this is a more economical way” to add capacity temporarily.
The offer is available both to users of the managed Hortonworks Data Cloud for AWS and to users who choose to build and manage their own clusters. Gnau said the offer would eventually extend to deployments on Microsoft’s Azure cloud platform.
The company is also using its user conference to announce the general availability of version 3.0 of Hortonworks DataFlow, or HDF, its open-source stream processing platform. DataFlow, which is based on the National Security Agency-developed Apache NiFi project, can be used to manage data flows from edge devices, providing security and streaming analytics through open-source engines such as Apache Storm and Apache Kafka. “You can plug in different streaming engines; it’s engine-agnostic,” Gnau said.
Streams can now be registered in a data dictionary for sharing using metadata descriptions. “Developers can have access to more streams without being the developer who created them,” Gnau said. HDF 3.0 also introduces Streaming Analytics Manager, which gives developers, business analysts and administrators the ability to build streaming applications without writing code.