Apr
02
2019
--

Edgybees’s new developer platform brings situational awareness to live video feeds

San Diego-based Edgybees today announced the launch of Argus, its API-based developer platform that makes it easy to add augmented reality features to live video feeds.

Edgybees has long used this capability in its own drone platform for first responders and enterprise customers, which lets users tag and track objects and people during emergencies to create better situational awareness.

I first saw a demo of the service a year ago, when the team walked a group of journalists through a simulated emergency, with live drone footage and an overlay of a street map and the location of ambulances and other emergency personnel. It’s clear how these features could be used in other situations as well, given that few companies have the expertise to combine the video footage, GPS data and other information, including geographic information systems, for their own custom projects.
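Part of what makes that combination hard is the geometry alone: putting a GPS-tagged ambulance on a video frame means projecting a geographic coordinate into pixel space. The sketch below illustrates the idea under simplifying assumptions (a drone camera pointing straight down and a flat-earth approximation); the function name and parameters are hypothetical and not part of the Argus API.

```python
import math

def geo_to_pixel(lat, lon, cam_lat, cam_lon, meters_per_pixel, frame_w, frame_h):
    """Project a GPS coordinate onto a nadir (straight-down) drone frame.

    Uses a flat-earth approximation around the camera position, which is
    reasonable over the short distances a single frame covers.
    """
    # Metres per degree of latitude are roughly constant; longitude
    # degrees shrink with the cosine of latitude.
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(cam_lat))

    dx = (lon - cam_lon) * m_per_deg_lon   # east offset in metres
    dy = (lat - cam_lat) * m_per_deg_lat   # north offset in metres

    # Image origin is top-left, so "north" maps to negative pixel-y.
    px = frame_w / 2 + dx / meters_per_pixel
    py = frame_h / 2 - dy / meters_per_pixel
    return px, py
```

A point directly under the camera lands at the frame's center; a point 10 metres north moves up the frame by 10 / meters_per_pixel pixels. A real system would also have to account for camera tilt, lens distortion and terrain elevation.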

Indeed, that’s what inspired the team to open up its platform. As the Edgybees team told me during an interview at the OurCrowd Summit last month, it’s impossible for the company to build a new solution for every vertical that could make use of it. So instead of even trying (though it’ll keep refining its existing products), it’s now opening up its platform to developers.

“The potential for augmented reality beyond the entertainment sector is endless, especially as video becomes an essential medium for organizations relying on drone footage or CCTV,” said Adam Kaplan, CEO and co-founder of Edgybees. “As forward-thinking industries look to make sense of all the data at their fingertips, we’re giving developers a way to tailor our offering and set them up for success.”

In the run-up to today’s launch, the company has already worked with organizations like the PGA to use its software to enhance the live coverage of its golf tournaments.

Feb
20
2019
--

Arm expands its push into the cloud and edge with the Neoverse N1 and E1

For the longest time, Arm was basically synonymous with chip designs for smartphones and very low-end devices. But more recently, the company launched solutions for laptops, cars, high-powered IoT devices and even servers. Today, ahead of MWC 2019, the company is officially launching two new products for cloud and edge applications, the Neoverse N1 and E1. Arm unveiled the Neoverse brand a few months ago, but it’s only now that it is taking concrete form with the launch of these new products.

“We’ve always been anticipating that this market is going to shift as we move more towards this world of lots of really smart devices out at the endpoint — moving beyond even just what smartphones are capable of doing,” Drew Henry, Arm’s SVP and GM for Infrastructure, told me in an interview ahead of today’s announcement. “And when you start anticipating that, you realize that those devices out of those endpoints are going to start creating an awful lot of data and need an awful lot of compute to support that.”

To address these two problems, Arm decided to launch two products: one that focuses on compute speed and one that is all about throughput, especially in the context of 5G.

ARM NEOVERSE N1

The Neoverse N1 platform is meant for infrastructure-class solutions that focus on raw compute speed. The chips should perform significantly better than previous Arm CPU generations aimed at the data center; the company says it saw speedups of 2.5x for Nginx and memcached, for example. Chip manufacturers can optimize the 7nm platform for their needs, with core counts ranging from as few as 4 up to 128.

“This technology platform is designed for a lot of compute power that you could either put in the data center or stick out at the edge,” said Henry. “It’s very configurable for our customers so they can design how big or small they want those devices to be.”

The E1 is also a 7nm platform, but with a stronger focus on edge computing use cases where you also need some compute power to maybe filter out data as it is generated, but where the focus is on moving that data quickly and efficiently. “The E1 is very highly efficient in terms of its ability to be able to move data through it while doing the right amount of compute as you move that data through,” explained Henry, who also stressed that the company made the decision to launch these two different platforms based on customer feedback.

There’s no point in launching these platforms without software support, though. A few years ago, that would have been a challenge because few commercial vendors supported their data center products on the Arm architecture. Today, many of the biggest open-source and proprietary projects and distributions run on Arm chips, including Red Hat Enterprise Linux, Ubuntu, Suse, VMware, MySQL, OpenStack, Docker, Microsoft .Net, DOK and OPNFV. “We have lots of support across the space,” said Henry. “And then as you go down to that tier of languages and libraries and compilers, that’s a very large investment area for us at Arm. One of our largest investments in engineering is in software and working with the software communities.”

And as Henry noted, AWS also recently launched its Arm-based servers — and that surely gave the industry a lot more confidence in the platform, given that the biggest cloud supplier is now backing it, too.

Feb
19
2019
--

Google acquires cloud migration platform Alooma

Google today announced its intention to acquire Alooma, a company that allows enterprises to combine all of their data sources into services like Google’s BigQuery, Amazon’s Redshift, Snowflake and Azure. The promise of Alooma is that it handles the data pipelines and manages them for its users. In addition to this data integration service, though, Alooma also helps with migrating to the cloud, cleaning up this data and then using it for AI and machine learning use cases.
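Alooma’s internals aren’t public, but the extract-transform-load pattern it automates — pull records from a source, clean them up, write them to a warehouse — can be sketched in a few lines. Everything here (the record shapes, the helper names) is illustrative, not Alooma’s or Google’s API.

```python
def run_pipeline(extract, transforms, load):
    """Tiny ETL loop: pull records from a source, run each through a
    chain of cleanup steps, and hand the survivors to a destination."""
    loaded = 0
    for record in extract():
        for step in transforms:
            record = step(record)
            if record is None:      # a transform can drop a bad row
                break
        if record is not None:
            load(record)
            loaded += 1
    return loaded

# Example source: rows from a hypothetical CRM export.
rows = [{"email": " Ada@Example.com ", "plan": "pro"},
        {"email": "", "plan": "free"}]

warehouse = []
count = run_pipeline(
    extract=lambda: iter(rows),
    transforms=[
        lambda r: None if not r["email"].strip() else r,       # drop empty emails
        lambda r: {**r, "email": r["email"].strip().lower()},  # normalize
    ],
    load=warehouse.append,
)
```

The value of a managed service like Alooma is everything around this loop: schema changes in the source, retries, backpressure and monitoring, which is exactly the operational burden Google is buying its way out of building from scratch.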

“Here at Google Cloud, we’re committed to helping enterprise customers easily and securely migrate their data to our platform,” Google VP of engineering Amit Ganesh and Google Cloud Platform director of product management Dominic Preuss write today. “The addition of Alooma, subject to closing conditions, is a natural fit that allows us to offer customers a streamlined, automated migration experience to Google Cloud, and give them access to our full range of database services, from managed open source database offerings to solutions like Cloud Spanner and Cloud Bigtable.”

Before the acquisition, Alooma had raised about $15 million, including an $11.2 million Series A round led by Lightspeed Venture Partners and Sequoia Capital in early 2016. The two companies did not disclose the price of the acquisition, but chances are we are talking about a modest price, given how much Alooma had previously raised.

Neither Google nor Alooma said much about what will happen to the existing products and customers — and whether it will continue to support migrations to Google’s competitors. We’ve reached out to Google and will update this post once we hear more.

Update: Here is Google’s statement about the future of the platform:

For now, it’s business as usual for Alooma and Google Cloud as we await regulatory approvals and complete the deal. After close, the team will be joining us in our Tel Aviv and Sunnyvale offices, and we will be leveraging the Alooma technology and team to provide our Google Cloud customers with a great data migration service in the future.

Regarding supporting competitors, yes, the existing Alooma product will continue to support other cloud providers. We will only be accepting new customers that are migrating data to Google Cloud Platform, but existing customers will continue to have access to other cloud providers.

So going forward, Alooma will not accept any new customers who want to migrate data to any competitors. That’s not necessarily surprising, and it’s good to see that Google will continue to support Alooma’s existing users. Those who use Alooma in combination with AWS, Azure and other non-Google services will likely start looking for other solutions soon, though, as this also means that Google isn’t likely to develop the service for them beyond its current state.

Alooma’s co-founders do stress, though, that “the journey is not over. Alooma has always aimed to provide the simplest and most efficient path toward standardizing enterprise data from every source and transforming it into actionable intelligence,” they write. “Joining Google Cloud will bring us one step closer to delivering a full self-service database migration experience bolstered by the power of their cloud technology, including analytics, security, AI, and machine learning.”

Oct
28
2018
--

Forget Watson, the Red Hat acquisition may be the thing that saves IBM

With its latest $34 billion acquisition of Red Hat, IBM may have found something more elementary than “Watson” to save its flagging business.

Though the acquisition of Red Hat is by no means a guaranteed victory for the Armonk, N.Y.-based computing company that has had more downs than ups over the past five years, it seems to be a better bet for “Big Blue” than an artificial intelligence program that was always more hype than reality.

Indeed, commentators are already noting that this may be a case where IBM finally hangs up the Watson hat and returns to the enterprise software and services business that has always been its core competency (albeit one that has been weighted far more heavily on consulting services — to the detriment of the company’s business).

Watson, the business division focused on artificial intelligence whose public claims were always more marketing than actually market-driven, has not performed as well as IBM had hoped and investors were losing their patience.

Critics — including analysts at the investment bank Jefferies (as early as one year ago) — were skeptical of Watson’s ability to deliver IBM from its business woes.

As we wrote at the time:

Jefferies pulls from an audit of a partnership between IBM Watson and MD Anderson as a case study for IBM’s broader problems scaling Watson. MD Anderson cut its ties with IBM after wasting $60 million on a Watson project that was ultimately deemed, “not ready for human investigational or clinical use.”

The MD Anderson nightmare doesn’t stand on its own. I regularly hear from startup founders in the AI space that their own financial services and biotech clients have had similar experiences working with IBM.

The narrative isn’t the product of any single malfunction, but rather the result of overhyped marketing, deficiencies in operating with deep learning and GPUs and intensive data preparation demands.

That’s not the only trouble IBM has had with Watson’s healthcare results. Earlier this year, the online medical journal Stat reported that Watson was giving clinicians recommendations for cancer treatments that were “unsafe and incorrect” — based on the training data it had received from the company’s own engineers and doctors at Sloan-Kettering who were working with the technology.

All of these woes were reflected in the company’s latest earnings call, where it reported falling revenues primarily from the Cognitive Solutions business, which includes Watson’s artificial intelligence and supercomputing services. Though IBM’s chief financial officer pointed to “mid-to-high” single-digit growth from Watson’s health business in the quarter, the transaction processing software business fell by 8%, and the company’s suite of hosted software services is basically an afterthought for businesses gravitating to Microsoft, Alphabet and Amazon for cloud services.

To be sure, Watson is only one of the segments that IBM had been hoping to tap for its future growth; and while it was a huge investment area for the company, the company always had its eyes partly fixed on the cloud computing environment as it looked for areas of growth.

It’s this area of cloud computing where IBM hopes that Red Hat can help it gain ground.

“The acquisition of Red Hat is a game-changer. It changes everything about the cloud market,” said Ginni Rometty, IBM Chairman, President and Chief Executive Officer, in a statement announcing the acquisition. “IBM will become the world’s number-one hybrid cloud provider, offering companies the only open cloud solution that will unlock the full value of the cloud for their businesses.”

The acquisition also puts an incredible amount of marketing power behind Red Hat’s various open source services business — giving all of those IBM project managers and consultants new projects to pitch and maybe juicing open source software adoption a bit more aggressively in the enterprise.

As Red Hat chief executive Jim Whitehurst told TheStreet in September, “The big secular driver of Linux is that big data workloads run on Linux. AI workloads run on Linux. DevOps and those platforms, almost exclusively Linux,” he said. “So much of the net new workloads that are being built have an affinity for Linux.”

Oct
04
2017
--

Rasa Core kicks up the context for chatbots

Context is everything when dealing with dialog systems. We humans take for granted how complex even our simplest conversations are. That’s part of the reason why dialog systems can’t live up to their human counterparts. But with an interactive learning approach and some open source love, Berlin-based Rasa is hoping to help enterprises solve their conversational AI problems…

Sep
19
2017
--

Google Cloud’s Natural Language API gets content classification and more granular sentiment analysis

Google Cloud announced two updates this morning to its Natural Language API. Specifically, users will now have access to content classification and entity sentiment analysis. These features are particularly valuable for brands and media companies. For starters, GCP users will now be able to tag content as corresponding with common topics like health, entertainment and law (cc: Henry)…
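Google’s classifier is model-based, but the input/output shape of content classification — text in, ranked topic labels out — can be illustrated with a deliberately naive keyword matcher. The topic list and function below are hypothetical and are not the Natural Language API.

```python
# Hypothetical topic keywords; Google's actual classifier uses a trained
# model, this only illustrates what "content classification" returns.
TOPIC_KEYWORDS = {
    "health": {"doctor", "hospital", "treatment", "clinic"},
    "entertainment": {"movie", "concert", "album", "celebrity"},
    "law": {"court", "lawsuit", "verdict", "attorney"},
}

def classify_content(text):
    """Return (topic, confidence) pairs sorted by keyword overlap."""
    words = set(text.lower().split())
    scores = []
    for topic, keywords in TOPIC_KEYWORDS.items():
        hits = len(words & keywords)
        if hits:
            scores.append((topic, hits / len(keywords)))
    return sorted(scores, key=lambda s: s[1], reverse=True)
```

A real classifier would of course handle stemming, phrases and thousands of categories, but the caller-facing contract — a ranked list of labels with confidence scores — is the same shape.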

Sep
18
2017
--

Ethereum will replace Visa in a ‘couple of years’ says founder

The mind behind Ethereum, Vitalik Buterin, is matter-of-fact about the crypto. In short, he believes what interviewer Naval Ravikant called a “brain virus” is the true future of security and economics and, with the right incentives, Ethereum can replace things like credit card networks and even gaming servers. Buterin separates the world into two kinds of people…

Sep
18
2017
--

Matroid picks up $10M Series A to automate video stream monitoring

As computer vision and object recognition technology continue to mature, we’re edging closer to automating away the exceedingly boring task of monitoring closed circuit TV cameras. Matroid is one of the startups leading the democratization of this variety of machine intelligence. The company is announcing a $10 million Series A this morning from NEA and Intel Capital…

Sep
15
2017
--

A typical day for researchers on Google’s Brain Team

What do you and researchers on Google’s Brain Team have most in common? You both probably spend a lot of time triaging email. In a Reddit AMA, 11 Google AI researchers took time to share the activities that consume the greatest chunks of their days. Email was a frequent topic of conversation, in addition to less banal activities like skimming academic papers and brainstorming…

Sep
14
2017
--

Facebook is the latest tech giant to hunt for AI talent in Canada

Facebook is turning its attention to Canada with a new AI research office in Montreal. Google and Microsoft already have outposts in the city, and countless other tech companies, including Uber, have researchers based in Canada. McGill University’s Joelle Pineau will be leading Facebook’s AI efforts in Montreal. Pineau’s research focus tends to lean heavily on robotics…
