R. Danes
Latest from R. Danes
Big Data Week insights: Putting heads and tech together to crush complexity
Last week, theCUBE, SiliconANGLE Media’s mobile livestreaming studio, held its premier yearly event, BigData NYC, in conjunction with the Strata Data Conference. Our analysts dug into the very latest big data developments in conversations with innovators, practitioners and vendors. Simultaneously, we hit the ground in Washington, D.C., for Splunk .conf2017. Altogether, it was a bustling week ...
These noble nerds fight human trafficking with data, blockchain analytics
Law enforcement agencies across the globe concur that human trafficking — the capture, transfer, receipt or harboring of humans for various exploitative purposes — is widespread. Available statistics are wildly inconsistent, however; the International Labor Organization has estimated the global number of victims to be 20.9 million, while the Minderoo Foundation Pty Ltd’s Global Slavery Index ...
Cramming AI models into IoT for big data at the edge: analyst predictions
Data is what separates business disruptors from the disrupted in the Digital Age. It has made millionaires and billionaires already, and the game has just teed up. More data means heftier profits — but only if it’s refined and delivered to end users on time and piping hot. Artificial intelligence can speed up data’s ingestion-to-insight cycle, ...
Open-source community pushing big data into AI realm
What’s the surest way to advance a technology in a short time? Give it away — to an open-source community. Seminal big data software library Apache Hadoop gained momentum in open source, and today, most disruptive big data development is springing from open source as well. “If people have the community traction, that is the ...
Splunk plunks down edge analytics blueprint at Splunk .conf2017
Splunk Inc., best known for software that monitors and analyzes machine-generated big data, has joined the movement toward intelligent edge computing. It sketched out its (not fully baked) plans to enable data analytics at “internet of things” end points today during Splunk .conf2017 in Washington, D.C. Splunk does not yet offer much in the way of ...
Voices from BigData NYC: Data-as-a-service is here, along with new tools in the shed
There is no lack of big data software tools nowadays, but businesses often spend much time learning to wield them with little profit to show for it. Could new big data as a service offerings make lighter work of monetization? “There are a plethora of tools, and tools are good. Platforms are better,” said John ...
HPE and Mellanox bring network up to NVMe speed with Ethernet switches
Storage vendors are pushing technologies like flash and Non-Volatile Memory Express, or NVMe, over fabric to meet the demands of modern workloads. But even the very best storage can’t whiz through a sluggish, traffic-jammed network. Hewlett Packard Enterprise Co. is incorporating Mellanox Technologies Ltd.’s Spectrum Ethernet Fabric Solution into its StoreFabric M-series switches to clear network arteries for data-heavy applications. ...
Demand for pay-per-drip cloud consumption trickles to vendors, partners
Veritas Technologies LLC is revamping its products and brand for multicloud. This move is rippling out to its solution-provider partners, including NetX Information Systems Inc. “Six months ago — NetX — we weren’t doing anything in cloud,” said Angelo Sciascia (pictured), senior vice president of NetX. NetX began in security and systems management, but in 2013, the company ...
Finishing OpenStack’s unfinished business with Veritas HyperScale
OpenStack’s open-source cloud computing platform has major cost advantages over public cloud providers. But enterprises that deploy workloads in OpenStack soon discover why public clouds can charge customers to manage the technical elements on their behalf. Pulling together all the components of an OpenStack cloud by themselves can be a headache for enterprises, according to Eric Kessels ...
How businesses can repurpose backup data for DevTest and cost savings
Backing up data for recovery in the event of a disaster provides critical insurance for enterprises. But why let potentially valuable data sit there and collect dust when it could be put to work for tests, development and cost savings? Loads of companies would like to do just that — but they can’t work very ...