Apr 17, 2019
--

Google Cloud brings on 27-year SAP veteran as it doubles down on enterprise adoption

Thomas Kurian, the newly minted CEO of Google Cloud, used the company’s Cloud Next conference last week to lay out his vision for the future of Google’s cloud computing platform. That vision involves, in part, a hiring spree to give businesses that want to work with Google more people to talk to and get help from. Unsurprisingly, Kurian is also looking to put his stamp on the executive team, too, and today announced that former SAP executive Robert Enslin is joining Google Cloud as its new president of Global Customer Operations.

Enslin’s hire is another clear signal that Kurian is focused on enterprise customers. Enslin, after all, is a veteran of the enterprise business, with 27 years at SAP, where he served on the company’s executive board until he announced his resignation from the company earlier this month. After leading various parts of SAP, including as president of its cloud product portfolio, president of SAP North America and CEO of SAP Japan, Enslin announced that he had “a few more aspirations to fulfill.” Those aspirations, we now know, include helping Google Cloud expand its lineup of enterprise customers.

“Rob brings great international experience to his role having worked in South Africa, Europe, Asia and the United States—this global perspective will be invaluable as we expand Google Cloud into established industries and growth markets around the world,” Kurian writes in today’s announcement.

Google Cloud has had a president of Global Customer Operations for the last two years, though: Paul-Henri Ferrand, a former Dell exec who was brought on by Google Cloud’s former CEO Diane Greene. Kurian says that Ferrand “has decided to take on a new challenge within Google.”

Apr 16, 2019
--

Google expands its container service with GKE Advanced

With its Kubernetes Engine (GKE), Google Cloud has long offered a managed service for running containers on its platform. Kubernetes users tend to have a variety of needs, but so far, Google only offered a single tier of GKE that wasn’t necessarily geared toward the high-end enterprise users the company is trying to woo. Today, however, the company announced a new advanced edition of GKE that introduces a number of new features, including an enhanced, financially backed SLA, additional security tools and new automation features. You can think of GKE Advanced as the enterprise version of GKE.

The new service will launch in the second quarter of the year; Google hasn’t yet announced pricing. The regular version of GKE is now called GKE Standard.

Google says the service builds upon the company’s own learnings from running a complex container infrastructure internally for years.

For enterprise customers, the financially backed SLA is surely a nice bonus. The promise here is 99.95 percent guaranteed availability for regional clusters.
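For a sense of scale, that guarantee translates into a concrete downtime budget. Here is a quick back-of-the-envelope calculation (the 99.95 percent figure is from the announcement; the rest is simple arithmetic):

```python
# Downtime budget implied by a 99.95% regional-cluster availability SLA.
sla = 0.9995
hours_per_year = 365 * 24  # 8,760 hours

allowed_downtime_hours = (1 - sla) * hours_per_year
print(f"Allowed downtime: {allowed_downtime_hours:.2f} hours per year")
print(f"Roughly {allowed_downtime_hours * 60 / 12:.1f} minutes per month")
```

That works out to about 4.4 hours of allowed downtime per year, or a bit over 20 minutes a month.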

Most users who opt for a managed Kubernetes environment do so because they don’t want to deal with the hassle of managing these clusters themselves. With GKE Standard, there’s still some work to be done with regard to scaling the clusters. Because of this, GKE Advanced includes a Vertical Pod Autoscaler that keeps an eye on resource utilization and adjusts it as necessary, as well as Node Auto Provisioning, an enhanced version of the cluster autoscaling in GKE Standard.

In addition to these new GKE Advanced features, Google is adding GKE security features like the GKE Sandbox, which is currently in beta and will come exclusively to GKE Advanced once it’s launched, and the ability to enforce that only signed and verified images are used in the container environment.

The Sandbox uses Google’s gVisor container sandbox runtime. With this, every sandbox gets its own user-space kernel, adding an additional layer of security. With Binary Authorization, GKE Advanced users also can ensure that all container images are signed by a trusted authority before they are put into production. Somebody could theoretically still smuggle malicious code into the containers, but this process, which enforces standard container release practices, for example, should ensure that only authorized containers can run in the environment.
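The mechanics of that check are easy to sketch. Binary Authorization itself verifies attestations made with asymmetric keys at deploy time, but the core idea — compute the image digest, verify it was signed by a trusted authority, and refuse to admit anything else — can be illustrated with a simplified, symmetric-key stand-in (the names and the HMAC scheme here are illustrative, not Google’s actual API):

```python
import hashlib
import hmac

TRUSTED_KEY = b"example-signing-key"  # stand-in for the signing authority's key

def sign_image(image_bytes: bytes) -> str:
    """Signing step: the trusted authority signs the image's digest."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return hmac.new(TRUSTED_KEY, digest.encode(), hashlib.sha256).hexdigest()

def admit_image(image_bytes: bytes, signature: str) -> bool:
    """Admission step: only images with a valid signature may run."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    expected = hmac.new(TRUSTED_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

image = b"FROM scratch\nCOPY app /app\n"
sig = sign_image(image)
print(admit_image(image, sig))            # valid signature: admitted
print(admit_image(image + b"evil", sig))  # tampered image: rejected
```

In a real deployment the authority would use an asymmetric keypair, so the admission controller only ever holds the public key.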

GKE Advanced also includes support for GKE usage metering, which allows companies to keep tabs on who is using a GKE cluster and charge them accordingly. This feature, too, will be exclusive to GKE Advanced.

Apr 10, 2019
--

With consumer G+ dead, Currents hopes to make waves in the enterprise

Google today announced that Google+ in G Suite, the last remaining remnants of what was once Google’s attempt to rival Facebook and Twitter, will now be called Currents. We don’t need to belabor the fact that Google+ was a flop and that its death was probably long overdue. We’ve done that. Now it’s time to look ahead and talk about what’s next for Currents. To do that, I sat down with David Thacker, the VP of Product Management for G Suite, at Google’s Cloud Next conference.

As Thacker told me, Google has shifted its resources to have the former Google+ team focus on Currents instead. But before we get to what that team plans to do, let’s talk about the name first. Currents, after all, was also the name of the predecessor of Google Play Newsstand, the app that was the predecessor of the Google News app.

The official line is that “Currents” is meant to evoke the flow of information. Thacker also noted that the team did a lot of research around the name and found that the old Currents app had “very low recognition.” I guess that’s fair. It also allows Google to reuse an old trademark without having to jump through too many hoops. Since the Google+ name obviously now carries some baggage, changing the name makes sense anyway. “The enterprise version is distinct and separate now and it was causing confusion among our customers,” said Thacker.

“This allows us to do new things and move much faster in the enterprise,” Thacker explained. “To run a consumer social network at the scale of consumer G+ requires a lot of resources and efforts, as you can imagine. And that’s partially the reason we decided to sunset that product, as we just didn’t feel it was worth that investment given the user base on that. But it basically frees up that team to focus on the enterprise vision.”

Now, however, with consumer G+ gone, the company is going to invest in Currents. “We’re moving consumer resources into the enterprise,” he said.

The plan here clearly isn’t to just let Currents linger but to improve it for business users. And while Google has never publicly shared user numbers, Thacker argues that those businesses that do use it tend to use it extensively. The hope, though, surely, is to increase that number — whatever it may be — significantly over time. “If you look at our top G Suite customers, most of them use the product actively as a way to connect really broad organizations,” Thacker said.

Thacker also noted that this move now removes a lot of constraints since the team doesn’t have to think about consumer features anymore. “When Google+ was first designed, it was never designed for that [enterprise] use case, but organizations had the same need to break down silos and help spread ideas and knowledge in their company,” Thacker explained. “So while Google+ didn’t succeed as a consumer product, it will certainly live on in the enterprise.”

What will that future look like? As Thacker told me, the team started with revamping the posting workflow, which was heavily focused on image sharing, for example, which isn’t exactly all that important in a business context.

But there are other features the team is planning to launch, too, including better analytics. “Analytics is a really important part of it,” said Thacker. “When people are posting on Currents, whether it’s executives trying to engage their employee base, they want to see how that’s resonating. And so we built in some pretty rich analytics.”

The team also built a new set of administrative controls that help manage how organizations can control and manage their usage of Currents.

Going forward then, we may actually see a bit of innovation in Currents — something that was sorely lacking from Google+ while it was lingering in limbo. Google Cloud’s CEO Thomas Kurian told me that he wants to make collaboration one of his focus areas. Currents is an obvious fit there, and there are plenty of ways to integrate it with the rest of G Suite still.

Apr 10, 2019
--

The right way to do AI in security

Artificial intelligence applied to information security can engender images of a benevolent Skynet, sagely analyzing more data than imaginable and making decisions at lightspeed, saving organizations from devastating attacks. In such a world, humans are barely needed to run security programs, their jobs largely automated out of existence, relegating them to a role as the button-pusher on particularly critical changes proposed by the otherwise omnipotent AI.

Such a vision is still in the realm of science fiction. AI in information security is more like an eager, callow puppy attempting to learn new tricks – minus the disappointment written on their faces when they consistently fail. No one’s job is in danger of being replaced by security AI; if anything, a larger staff is required to ensure security AI stays firmly leashed.

Arguably, AI’s highest use case currently is to add futuristic sheen to traditional security tools, rebranding timeworn approaches as trailblazing sorcery that will revolutionize enterprise cybersecurity as we know it. The current hype cycle for AI appears to be the roaring, ferocious crest at the end of a decade that began with bubbly excitement around the promise of “big data” in information security.

But what lies beneath the marketing gloss and quixotic lust for an AI revolution in security? How did AI ascend to supplant the lustrous zest around machine learning (“ML”) that dominated headlines in recent years? Where is there true potential to enrich information security strategy for the better – and where is it simply an entrancing distraction from more useful goals? And, naturally, how will attackers plot to circumvent security AI to continue their nefarious schemes?

How did AI grow out of this stony rubbish?

The year AI debuted as the “It Girl” in information security was 2017. The year prior, MIT completed their study showing “human-in-the-loop” AI out-performed AI and humans individually in attack detection. Likewise, DARPA conducted the Cyber Grand Challenge, a battle testing AI systems’ offensive and defensive capabilities. Until this point, security AI was imprisoned in the contrived halls of academia and government. Yet, the history of two vendors exhibits how enthusiasm surrounding security AI was driven more by growth marketing than user needs.

Apr 10, 2019
--

Daily Crunch: Meet the new CEO of Google Cloud

The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 9am Pacific, you can subscribe here.

1. Google Cloud’s new CEO on gaining customers, startups, supporting open source and more

Thomas Kurian, who came to Google Cloud after 22 years at Oracle, said the team is rolling out new contracts and plans to simplify pricing.

Most importantly, though, Google will go on a hiring spree: “A number of customers told us ‘we just need more people from you to help us.’ So that’s what we’ll do.”

2. Walmart to expand in-store tech, including Pickup Towers for online orders and robots

Walmart is doubling down on technology in its brick-and-mortar stores in an effort to better compete with Amazon. The retailer says it will add to its U.S. stores 1,500 new autonomous floor cleaners, 300 more shelf scanners, 1,200 more FAST Unloaders and 900 new Pickup Towers.

3. Udacity restructures operations, lays off 20 percent of its workforce

The objective is to do more than simply keep the company afloat, according to co-founder Sebastian Thrun. Instead, Thrun says these measures will allow Udacity to move from a money-losing operation to a “break-even or profitable company by next quarter and then moving forward.”


4. The government is about to permanently bar the IRS from creating a free electronic filing system

That’s right, members of Congress are working to prohibit a branch of the federal government from providing a much-needed service that would make the lives of all of their constituents much easier.

5. Here’s the first image of a black hole

Say hello to the black hole deep inside Messier 87, a galaxy located in the Virgo cluster some 55 million light years away.

6. Movo grabs $22.5M to get more cities in LatAm scooting

The Spanish startup targets cities in its home market and in markets across Latin America, offering last-mile mobility via rentable electric scooters.

7. Uber, Lyft and the challenge of transportation startup profits

An article arguing that everything you know about the cost of transportation is wrong. (Extra Crunch membership required.)

Apr 10, 2019
--

Google launches new security tools for G Suite users

Google today launched a number of security updates to G Suite, its online productivity and collaboration platform. The focus of these updates is on protecting a company’s data inside G Suite, both through controlling who can access it and through providing new tools for preventing phishing and malware attacks.

To do this, Google is announcing the beta launch of its advanced phishing and malware protection. This is meant to help admins protect users from malicious attachments and inbound email spoofing, among other things.

The most interesting feature here, though, is the new security sandbox, another beta feature for G Suite enterprise users. The sandbox allows admins to add an extra layer of protection on top of the standard attachment scans for known viruses and malware. Those existing tools can’t fully protect you against zero-day ransomware or sophisticated malware, though. So instead of just letting you open the attachment, this tool executes the attachment in a sandbox environment to check if there are any security issues.

Google is also announcing the beta launch of its new security and alert center for admins. These tools are meant to create a single service that features best-practice recommendations, a unified notifications center and tools to triage and take action against threats, all with a focus on collaboration among admins. Also new is a security investigation tool that mostly focuses on allowing admins to create automated workflows for sending notifications or assigning ownership to security investigations.

Apr 10, 2019
--

Google launches its coldest storage service yet

At its Cloud Next conference, Google today launched a new archival cold storage service. This new service, which doesn’t seem to have a fancy name, will complement the company’s existing Nearline and Coldline services for storing vast amounts of infrequently used data at an affordable low cost.

The new archive class takes this one step further, though. It’s cheap, with prices starting at $0.0012 per gigabyte per month. That’s $1.23 per terabyte per month.

The new service will become available later this year.

What makes Google’s cold storage different from the likes of AWS S3 Glacier, for example, is that the data is immediately available, with millisecond latency. Glacier and similar services typically make you wait a significant amount of time before the data can be used. Indeed, in a thinly veiled swipe at AWS, Google directors of product management Dominic Preuss and Dave Nettleton note that “unlike tape and other glacially slow equivalents, we have taken an approach that eliminates the need for a separate retrieval process and provides immediate, low-latency access to your content.”

To put that into context, a gigabyte stored in AWS Glacier will set you back $0.004 per month. AWS offers another option, though: AWS Glacier Deep Archive. This service recently went live, at a cost of $0.00099 per gigabyte per month, though with significantly longer retrieval times.
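The per-terabyte figures follow directly from the per-gigabyte prices, using binary units (1,024 GB per TB). A quick comparison of the published numbers:

```python
# Published per-GB monthly prices (from the post); per-TB figures derived.
GB_PER_TB = 1024

prices_per_gb = {
    "Google archive class": 0.0012,
    "AWS S3 Glacier": 0.004,
    "AWS Glacier Deep Archive": 0.00099,
}

for name, per_gb in prices_per_gb.items():
    print(f"{name}: ${per_gb * GB_PER_TB:.2f} per TB per month")
```

That puts Google’s new class at roughly a third of standard Glacier pricing, but still slightly above Deep Archive, where the trade-off is retrieval time rather than price.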

Google’s new object storage service uses the same APIs as Google’s other storage classes and Google promises that the data is always redundantly stored across availability zones, with eleven 9’s of annual durability.

In a press conference ahead of today’s official announcement, Preuss noted that this service is mostly a replacement for on-premises tape backups. But now that many enterprises keep as much data as they can, for example to later train their machine learning models, the amount of fresh data that needs to be stored for the long term continues to increase rapidly, too.

With low latency and the promise of high availability, there obviously has to be a drawback here, otherwise Google wouldn’t (and couldn’t) offer this service at this price. “Just like when you’re going from our standard [storage] class to Nearline or Coldline, there’s a committed amount of time that you have to remain in that class,” Preuss explained. “So basically, to get a lower price you are committing to keep the data in the Google Cloud Storage bucket for a period of time.”

Correction: a previous version of the post said that AWS Glacier Deep Archive wasn’t available yet when it actually went live two weeks ago. We changed the post to reflect this. 

Apr 10, 2019
--

Salesforce and Google want to build a smarter customer service experience

Anyone who has dealt with bad customer service has felt the frustration that comes with a lack of basic understanding of who you are as a customer and what you need. Google and Salesforce feel your pain, and today the two companies expanded their partnership to try and create a smarter customer service experience.

The goal is to combine Salesforce’s customer knowledge with Google’s customer service-related AI products and build on the strengths of the combined solution to produce a better customer service experience, whether that’s with an agent or a chatbot.

Bill Patterson, executive vice president for Salesforce Service Cloud, gets that bad customer service is a source of vexation for many consumers, but his goal is to change that. Patterson points out that Google and Salesforce have been working together since 2017, but mostly on sales- and marketing-related projects. Today’s announcement marks the first time they are working on a customer service solution together.

For starters, the partnership is looking at the human customer service agent experience. “The combination of Google Contact Center AI, which highlights the language and the stream of intelligence that comes through that interaction, combined with the customer data and the business process information that Salesforce has, really makes that an incredibly enriching experience for agents,” Patterson explained.

The Google software will understand voice and intent, and have access to a set of external information like weather or news events that might be having an impact on the customers, while Salesforce looks at the hard data it stores about the customer such as who they are, their buying history and previous interactions.

The companies believe that by bringing these two types of data together, they can surface relevant information in real time to help the agent give the best answer. It may be the best article or it could be just suggesting that a shipment might be late because of bad weather in the area.

Customer service agent screen showing information surfaced by intelligent layers in Google and Salesforce

The second part of the announcement involves improving the chatbot experience. We’ve all dealt with rigid chatbots that can’t understand your request. Sure, they can sometimes channel your call to the right person, but if you have any question outside the most basic ones, they tend to get stuck, while you scream “Operator! I said OPERATOR!” (Or at least I do.)

Google and Salesforce are hoping to change that by bringing together Einstein, Salesforce’s artificial intelligence layer, and Google’s natural language understanding (NLU) technology in its Dialogflow product to better understand the request, monitor the sentiment and direct you to a human operator before you get frustrated.

Patterson’s department, which is on a $3.8 billion run rate, is poised to become the largest revenue producer in the Salesforce family by the end of the year. The company itself is on a run rate over $14 billion.

“So many organizations just struggle with primitives of great customer service and experience. We have a lot of passion for making everyday interaction better with agents,” he said. Maybe this partnership will bring some much needed improvement.

Apr 10, 2019
--

Google makes the power of BigQuery available in Sheets

Google today announced a new service that makes the power of BigQuery, its analytics data warehouse, available in Sheets, its web-based spreadsheet tool. These so-called “connected sheets” face none of the usual limitations of Google’s regular spreadsheets, meaning there are no row limits, for example. Instead, users can take a massive data set from BigQuery, with potentially billions of rows, and turn it into a pivot table.

The idea here is to enable virtually anybody to make use of all the data that is stored in BigQuery. That’s because from the user’s perspective, this new kind of table is simply a spreadsheet, with all of the usual functionality you’d expect from a spreadsheet. With this, Sheets becomes a front end for BigQuery — and virtually any business user knows how to use a spreadsheet.
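Under the hood, the pivot table a connected sheet builds is a group-and-aggregate operation that BigQuery executes over the full data set, so only the summarized result ever reaches the spreadsheet. A minimal plain-Python stand-in for that aggregation (the rows and column names here are made up for illustration):

```python
from collections import defaultdict

# Toy stand-in for rows that would live in a BigQuery table.
rows = [
    {"region": "EMEA", "product": "A", "revenue": 120.0},
    {"region": "EMEA", "product": "B", "revenue": 80.0},
    {"region": "APAC", "product": "A", "revenue": 200.0},
    {"region": "APAC", "product": "A", "revenue": 50.0},
]

# Pivot: regions as rows, products as columns, summed revenue as values.
pivot = defaultdict(lambda: defaultdict(float))
for r in rows:
    pivot[r["region"]][r["product"]] += r["revenue"]

for region, cols in sorted(pivot.items()):
    print(region, dict(cols))
```

The point is that the heavy lifting happens in the warehouse; the sheet only ever holds the small, pivoted output, which is why the usual row limits don’t apply.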

This also means you can use all of the usual visualization tools in Sheets and share your data with others in your organization.

“Connected sheets are helping us democratize data,” says Nikunj Shanti, chief product officer at AirAsia. “Analysts and business users are able to create pivots or charts, leveraging their existing skills on massive data sets, without needing SQL. This direct access to the underlying data in BigQuery provides access to the most granular data available for analysis. It’s a game changer for AirAsia.”

The beta of connected sheets should go live within the next few months.

In this context, it’s worth mentioning that Google also today announced the beta launch of BigQuery BI Engine, a new service for business users that connects BigQuery with Google Data Studio for building interactive dashboards and reports. This service, too, is available in Google Data Studio today and will also become available through third-party services like Tableau and Looker in the next few months.

“With BigQuery BI Engine behind the scenes, we’re able to gain deep insights very quickly in Data Studio,” says Rolf Seegelken, senior data analyst at Zalando. “The performance of even our most computationally intensive dashboards has sped up to the point where response times are now less than a second. Nothing beats ‘instant’ in today’s age, to keep our teams engaged in the data!”

Apr 10, 2019
--

What’s left of Google+ is now called Currents

Google+ for consumers is officially dead, but it’s still alive for enterprise users. Only a few days after completely shutting down the public version of Google+, Google today announced that it is giving the enterprise version a new name. It’s now called Currents.

If that name sounds familiar, it’s because Google once offered another service called Currents, a social magazine app with Google+ integrations that was later replaced by Google Play Newsstand. That history clearly bodes well for the new Currents.

Like before, Google+/Currents is meant to give employees a place to share knowledge and provide them with a place for internal discussions.

Google is probably doing the right thing by completely eliminating the Google+ moniker. The fact that there was still a version of Google+ for the enterprise created a bit of confusion when it announced the shutdown of the consumer version. Maybe this move will also allow the remaining developers on the project to leave the failed legacy of Google+ behind and try something new. As the only focus is now on business users, that should be fairly easy, even though the code base surely still reflects a time when Google’s leadership thought that social search was the future.
