Jul 23, 2018

Google Cloud’s partnership network begins paying dividends

When Google Cloud brought Diane Greene on board at the end of 2015, one of her goals was to expand the division’s partnership network, an approach she found worked quite well when she was running VMware in the early 2000s. It appears to be working at Google too.

This week at Google Next, the company’s annual cloud conference, Google announced that the partner program has grown significantly since the beginning of last year. “Since the start of 2017, we’ve increased the number of technology partners by 10x and we’ve more than doubled our team supporting these partners,” Google’s Nan Boden and Nina Harding wrote in a blog post on the program’s progress.

Google is partnering with a variety of large enterprise vendors, from Cisco to SAP to NetApp to Diane Greene’s old company, VMware. It is also working with traditional systems integrators like Accenture, Deloitte, KPMG and others.

All of this is enabling Google Cloud customers to work through familiar channels while helping Google to build out its cloud business and gain more traction in the enterprise. Partners in general help customers work with a platform like Google Cloud more easily by providing integrations that might not otherwise exist.

One thing Google has going for it, especially on the G Suite side of the house, which includes Gmail, Docs, Drive and Calendar, is sheer numbers: millions of users. Partners benefit from working with a company like Google Cloud because it helps them serve their common users, and perhaps attract new ones; Google benefits because those partnerships make its cloud services all the more valuable to customers.

The company sees Software as a Service in particular as a key area for growth, and it announced a new partnership program this week, with access to more Google personnel and marketing funding, to encourage more interaction with SaaS partners on the platform. Google already has agreements in place with popular SaaS vendors including Salesforce, Box, MongoDB, Zenoss, Elastic, RedisLabs, JFrog, BetterCloud, DialPad and many others.

Cloud computing has always been different from traditional enterprise computing because cooperation has always been the watchword. Even companies like Salesforce, DialPad, Cisco and SAP, which could be competing with Google on some level, see the benefits of working with it (and other cloud providers). It’s what their customers want, and cooperation, when it makes sense, benefits all parties involved.

Jul 17, 2018

Google’s new ‘Grab and Go’ project helps businesses loan Chromebooks to their employees

Internally, Google runs a ‘Grab and Go’ program that allows employees to use self-service stations to quickly borrow and return Chromebooks without having to go through a lengthy IT approval process. Now, it’s bringing this same idea to other businesses.

Chromebooks have found their place in education, and a number of larger enterprise companies are also getting on board with the idea of a centrally managed device that mostly focuses on the browser. That’s perhaps no surprise, given that schools and enterprises are looking for much the same thing from these devices.

At Google, the system has seen more than 30,000 users complete more than 100,000 loans so far.

While Google wants others to run similar programs (and use more Chromebooks in the process), it’s worth noting that this is a limited preview program and that Google isn’t building and selling racks or other infrastructure for it. As a Google spokesperson told us, Google will give companies that want to try this the open source code to build the system and will advise them through setup and deployment. It will also engage with partners to help them build the hardware or set up ‘Grab and Go’ as a service.

Employees who want to use one of these ‘Grab and Go’ stations simply pick up a laptop, sign in and move on with their day. When they are done, they simply return the laptop. That’s it. Easy.

That’s not quite as exciting as Google building and selling racks of Chromebooks, but this project is clearly another move to bring Chromebooks to the enterprise. Specifically, Google says that this program is meant for frontline workers who only need devices for a short period of time, as well as shift workers and remote workers.

Jul 12, 2018

Google’s Apigee teams up with Informatica to extend its API ecosystem

Google acquired API management service Apigee back in 2016, but it’s been pretty quiet around the service since then. Today, however, Apigee announced a number of smaller updates that introduce a few new integrations with the Google Cloud platform, as well as a major new partnership with cloud data management and integration firm Informatica that essentially makes Informatica the preferred integration partner for Google Cloud.

Like most partnerships in this space, the deal with Informatica involves some co-selling and marketing agreements, but that alone wouldn’t be all that interesting. What makes this deal stand out is that Google is actually baking some of Informatica’s tools right into the Google Cloud dashboard. This will allow Apigee users to tap Informatica’s wide range of integrations with third-party enterprise applications, while Informatica users will be able to publish their APIs through Apigee and have that service manage them.

Some of Google’s competitors, including Microsoft, have built their own integration services. As Google Cloud director of product management Ed Anuff told me, that wasn’t really on Google’s road map. “It takes a lot of know-how to build a rich catalog of connectors,” he said. “You could go and build an integration platform but if you don’t have that, you can’t address your customer’s needs.” Instead, Google went looking for a partner that already has this large catalog and plenty of credibility in the enterprise space.

Similarly, Informatica’s senior VP and GM for big data, cloud and data integration, Ronen Schwartz, noted that many of his company’s customers are now looking to move into the cloud, and this move will make it easier for Informatica’s customers to bring their services into Apigee and open them up for external applications. “With this partnership, we are bringing the best of breed of both worlds to our customers,” he said. “And we are doing it now and we are making it available in an integrated, optimized way.”

Jul 3, 2018

Google Cloud’s COO departs after 7 months

At the end of last November, Google announced that Diane Bryant, who at the time was on a leave of absence from her position as the head of Intel’s data center group, would become Google Cloud’s new COO. This was a major coup for Google, but it wasn’t meant to last. After only seven months on the job, Bryant has left Google Cloud, as Business Insider first reported today.

“We can confirm that Diane Bryant is no longer with Google. We are grateful for the contributions she made while at Google and we wish her the best in her next pursuit,” a Google spokesperson told us when we reached out for comment.

The reasons for Bryant’s departure are currently unclear. It’s no secret that Intel is looking for a new CEO and Bryant would fit the bill. Intel also famously likes to recruit insiders as its leaders, though I would be surprised if the company’s board had already decided on a replacement. Bryant spent more than 25 years at Intel and her hire at Google looked like it would be a good match, especially given that Google’s position behind Amazon and Microsoft in the cloud wars means that it needs all the executive talent it can get.

When Bryant was hired, Google Cloud CEO Diane Greene noted that “Diane’s strategic acumen, technical knowledge and client focus will prove invaluable as we accelerate the scale and reach of Google Cloud.” According to the most recent analyst reports, Google Cloud’s market share has ticked up a bit — and its revenue has increased at the same time — but Google remains a distant third in the competition and it doesn’t look like that’s changing anytime soon.

Jun 26, 2018

With Cloud Filestore, the Google Cloud gets a new storage option

Google is giving developers a new storage option in its cloud. Cloud Filestore, which will launch into beta next month, essentially offers a fully managed network attached storage (NAS) service in the cloud. This means that companies can now easily run applications that need a traditional file system interface on the Google Cloud Platform.

Traditionally, developers who wanted a standard file system, rather than the object storage and database options Google already offered, had to rig up their own file server with a persistent disk. Filestore does away with all of this and simply allows Google Cloud users to spin up file storage as needed.

The promise of Filestore is that it offers high throughput, low latency and high IOPS. The service will come in two tiers: premium and standard. The premium tier will cost $0.30 per GB per month and promises a throughput of 700 MB/s and 30,000 IOPS, no matter the storage capacity. Standard-tier Filestore storage will cost $0.20 per GB per month, but performance scales with capacity and doesn’t hit its peak until you store more than 10TB of data in Filestore.
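To make the tier trade-off concrete, here is a quick back-of-the-envelope calculation based only on the per-GB prices quoted above (a rough sketch; the function and tier labels are just for illustration):

```python
# Rough monthly cost estimate for Cloud Filestore, using the launch
# prices quoted above: premium $0.30/GB/month, standard $0.20/GB/month.
PRICE_PER_GB_MONTH = {"premium": 0.30, "standard": 0.20}

def monthly_cost(tier, capacity_gb):
    """Estimated monthly cost in USD for a given tier and capacity."""
    return PRICE_PER_GB_MONTH[tier] * capacity_gb

# 10 TB (10,240 GB) is the point at which the standard tier reaches
# its peak performance, per the launch details above.
for tier in ("premium", "standard"):
    print(f"{tier}: ${monthly_cost(tier, 10_240):,.2f}/month")
# premium: $3,072.00/month
# standard: $2,048.00/month
```

At that capacity, the premium tier’s flat performance guarantee costs roughly 50% more per month than standard.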

Google launched Filestore at an event in Los Angeles that mostly focused on the entertainment and media industry. There are plenty of enterprise applications in those verticals that need a shared file system, but the same can be said for many other industries that rely on similar enterprise applications.

The Filestore beta will launch next month. Because it’s still in beta, Google isn’t making any uptime promises right now and there is no ETA for when the service will come out of beta.

Jun 19, 2018

Google injects Hire with AI to speed up common tasks

Since Google Hire launched last year, it has been trying to make it easier for hiring managers to handle the data and tasks associated with the hiring process, while maybe tweaking LinkedIn while it’s at it. Today the company announced some AI-infused enhancements that it says will help save time and energy spent on manual processes.

“By incorporating Google AI, Hire now reduces repetitive, time-consuming tasks, like scheduling interviews into one-click interactions. This means hiring teams can spend less time with logistics and more time connecting with people,” Berit Hoffmann, Google’s Hire product manager, wrote in a blog post announcing the new features.

The first piece involves making it easier and faster to schedule interviews with candidates. This is a multi-step activity that involves picking appropriate interviewers, choosing a time and date that works for all parties involved and booking a room in which to conduct the interview. Organizing these kinds of logistics tends to eat up a lot of time.

“To streamline this process, Hire now uses AI to automatically suggest interviewers and ideal time slots, reducing interview scheduling to a few clicks,” Hoffmann wrote.
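Google hasn’t published how Hire’s suggestions work under the hood, but the underlying scheduling problem is easy to illustrate. Here is a toy Python sketch, purely hypothetical and not Google’s algorithm, that intersects interviewers’ calendars to surface free slots:

```python
from datetime import datetime, timedelta

def free_slots(busy_by_person, day_start, day_end, slot=timedelta(hours=1)):
    """Yield slot start times in [day_start, day_end) where no
    interviewer has a conflicting busy interval."""
    t = day_start
    while t + slot <= day_end:
        conflict = any(
            t < busy_end and busy_start < t + slot  # intervals overlap
            for intervals in busy_by_person.values()
            for busy_start, busy_end in intervals
        )
        if not conflict:
            yield t
        t += slot

# Hypothetical calendars for two interviewers.
busy = {
    "interviewer_a": [(datetime(2018, 6, 20, 9), datetime(2018, 6, 20, 11))],
    "interviewer_b": [(datetime(2018, 6, 20, 13), datetime(2018, 6, 20, 14))],
}
day = datetime(2018, 6, 20)
for start in free_slots(busy, day.replace(hour=9), day.replace(hour=17)):
    print(start.strftime("%H:%M"))  # 11:00, 12:00, 14:00, 15:00, 16:00
```

The real feature presumably layers interviewer suggestions and room booking on top of this kind of availability check.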

Photo: Google

Another common hiring chore is finding keywords in a resume. Hire’s AI now finds these words for a recruiter automatically by analyzing terms in a job description or search query and highlighting relevant words, including synonyms and acronyms, in a resume, saving the time spent manually searching for them.
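As a rough sketch of the concept, not Google’s implementation and with a made-up synonym table, matching a search term plus its synonyms and acronyms against resume text could look like this:

```python
import re

# Hand-written synonym/acronym table for illustration only; Hire
# presumably derives these relationships with ML rather than a lookup.
SYNONYMS = {
    "software engineer": ["software developer", "swe", "programmer"],
}

def find_matches(term, resume):
    variants = [term] + SYNONYMS.get(term.lower(), [])
    pattern = r"\b(?:" + "|".join(re.escape(v) for v in variants) + r")\b"
    return re.findall(pattern, resume, flags=re.IGNORECASE)

resume = "Senior SWE with five years as a software developer at Acme."
print(find_matches("software engineer", resume))
# ['SWE', 'software developer']
```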

Photo: Google

Finally, another standard part of the hiring process is making phone calls, lots of phone calls. To make this easier, the latest version of Google Hire has a new click-to-call function. Simply click the phone number and it dials automatically and registers the call in a call log for easy recall or auditing.

While Microsoft has LinkedIn and Office 365, Google has G Suite and Google Hire. The strategy behind Hire is to allow hiring personnel to work in the G Suite tools they are immersed in every day and incorporate Hire functionality within those tools.

It’s not unlike CRM tools that integrate with Outlook or Gmail, because that’s where salespeople spend a good deal of their time anyway. The idea is to reduce the time spent switching between tools and make the process a more integrated experience.

While none of these features individually will necessarily wow you, together they use Google AI to simplify common tasks and reduce some of the tedium associated with everyday hiring work.

Jun 13, 2018

Salesforce deepens data sharing partnership with Google

Last fall at Dreamforce, Salesforce announced a deepening friendship with Google. That began to take shape in January with integration between Salesforce CRM data and Google Analytics 360 and Google BigQuery. Today, the two cloud giants announced the next step, as the companies will share data between Google Analytics 360 and the Salesforce Marketing Cloud.

This particular data sharing partnership makes even more sense as the companies can share web analytics data with marketing personnel to deliver ever more customized experiences for users (or so the argument goes, right?).

That connection certainly didn’t escape Salesforce’s VP of product marketing, Bobby Jania. “Now, marketers are able to deliver meaningful consumer experiences powered by the world’s number one marketing platform and the most widely adopted web analytics suite,” Jania told TechCrunch.

Brent Leary, owner of the consulting firm CRM Essentials, says the partnership is going to be meaningful for marketers. “The tighter integration is a big deal because a large portion of Marketing Cloud customers are Google Analytics/GA 360 customers, and this paves the way to more seamlessly see what activities are driving successful outcomes,” he explained.

The partnership involves four integrations that effectively allow marketers to round-trip data between the two platforms. For starters, consumer insights from both Marketing Cloud and Google Analytics 360 will be brought together into a single analytics dashboard inside Marketing Cloud. Conversely, Marketing Cloud data will be viewable inside Google Analytics 360, both for attribution analysis and to deliver more customized web experiences. These first three integrations will be generally available starting today.

A fourth element of the partnership being announced today won’t be available in Beta until the third quarter of this year. “For the first time ever audiences created inside the Google Analytics 360 platform can be activated outside of Google. So in this case, I’m able to create an audience inside of Google Analytics 360 and then I’m able to activate that audience in Marketing Cloud,” Jania explained.

An audience is like a segment: if you have a group of like-minded individuals in the Google analytics tool, you can simply transfer it to Salesforce Marketing Cloud and send more relevant emails to that group.

This data sharing capability removes a lot of the labor involved in trying to monitor data stored in two places, but of course it also raises questions about data privacy. Jania was careful to point out that the two platforms are not sharing specific information about individual consumers, which could be in violation of the new GDPR data privacy rules that went into effect in Europe at the end of last month.

“What [we’re sharing] is either metadata or aggregated reporting results. Just to be clear there’s no personal identifiable data that is flowing between the systems so everything here is 100% GDPR-compliant,” Jania said.
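In other words, what moves between the systems is a segment definition plus aggregate numbers, not rows of identifiable people. A purely conceptual sketch of what such a payload might contain (every field name here is invented for illustration):

```python
# Conceptual illustration only: not Salesforce's or Google's actual schema.
# An "audience" travels as rules plus aggregates, never as a person-level list.
audience = {
    "name": "cart_abandoners_30d",
    "rules": [
        {"metric": "sessions_with_cart_add", "op": ">=", "value": 1},
        {"metric": "purchases", "op": "==", "value": 0},
    ],
    "window_days": 30,
    "aggregates": {"size": 48210, "avg_session_minutes": 7.4},
    # Deliberately absent: emails, user IDs, cookies, device identifiers.
}
```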

But Leary says it might not be so simple, especially in light of recent data sharing abuses. “With Facebook having to open up about how they’re sharing consumer data with other organizations, companies like Salesforce and Google will have to be more careful than ever before about how the consumer data they make available to their corporate customers will be used by them. It’s a whole new level of scrutiny that has to be a part of the data sharing equation,” Leary said.

The announcements were made today at the Salesforce Connections conference taking place in Chicago this week.

Jun 7, 2018

Google Cloud announces the Beta of single tenant instances

One of the characteristics of cloud computing is that when you launch a virtual machine, it gets distributed wherever it makes the most sense for the cloud provider. That usually means sharing servers with other customers in what is known as a multi-tenant environment. But what about times when you want a physical server dedicated just to you?

To help meet those kinds of demands, Google announced the beta of Google Compute Engine sole-tenant nodes, which have been designed for use cases such as regulatory or compliance requirements, where you need full control of the underlying physical machine and sharing is not desirable.

“Normally, VM instances run on physical hosts that may be shared by many customers. With sole-tenant nodes, you have the host all to yourself,” Google wrote in a blog post announcing the new offering.

Diagram: Google

Google has tried to be as flexible as possible, letting customers choose exactly what configuration they want in terms of CPU and memory. Customers can also let Google choose the dedicated server that’s best at any particular moment, or they can manually select the server if they want that level of control. In either case, they will be assigned a dedicated machine.

If you want to play with this, there is a free tier and then various pricing tiers for a variety of computing requirements. Regardless of your choice, you will be charged on a per-second basis with a one-minute minimum charge, according to Google.
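The billing model is easy to reason about: usage is metered per second, but anything under a minute is rounded up to a minute. A minimal sketch (the hourly rate is a made-up placeholder, not Google’s actual price):

```python
# Per-second billing with a one-minute minimum, as described above.
HOURLY_RATE_USD = 4.00  # hypothetical sole-tenant node rate, for illustration

def charge(seconds_used):
    billed_seconds = max(60, seconds_used)  # one-minute minimum
    return billed_seconds * HOURLY_RATE_USD / 3600

print(f"${charge(10):.4f}")    # 10s of use bills as 60s -> $0.0667
print(f"${charge(5400):.4f}")  # 90 minutes bills exactly -> $6.0000
```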

Since this feature is still in Beta, it’s worth noting that it is not covered under any SLA. Microsoft and Amazon have similar offerings.

Jun 6, 2018

Four years after its release, Kubernetes has come a long way

On June 6, 2014, Kubernetes was released for the first time. At the time, nobody could have predicted that four years later the project would become the de facto standard for container orchestration or that the biggest tech companies in the world would be backing it. That would come later.

If you think back to June 2014, containerization was just beginning to take off thanks to Docker, which was popularizing the concept with developers, but it was still so early that there was no standard way to manage those containers.

Google had been using containers as a way to deliver applications for years and ran a tool called Borg to handle orchestration. It’s called an orchestrator because, much like the conductor of an orchestra, it decides when a container is launched and when it shuts down once it has completed its job.

At the time, two Google engineers, Craig McLuckie and Joe Beda, who would later go on to start Heptio, were looking at developing an orchestration tool like Borg for companies that might not have the depth of engineering talent of Google to make it work. They wanted to spread this idea of how they develop distributed applications to other developers.

Hello world

Before that first version hit the streets, what would become Kubernetes developed out of a need for an orchestration layer that Beda and McLuckie had been considering for a long time. They were both involved in bringing Google Compute Engine, Google’s Infrastructure as a Service offering, to market, but they felt like there was something missing in the tooling that would fill in the gaps between infrastructure and platform service offerings.

“We had long thought about trying to find a way to bring a sort of a more progressive orchestrated way of running applications in production. Just based on our own experiences with Google Compute Engine, we got to see firsthand some of the challenges that the enterprise faced in moving workloads to the cloud,” McLuckie explained.

He said that they also understood some of the limitations associated with virtual machine-based workloads and were thinking about tooling to help with all of that. “And so we came up with the idea to start a new project, which ultimately became Kubernetes.”

Let’s open source it

When Google began developing Kubernetes in March 2014, it wanted nothing less than to bring container orchestration to the masses. It was a big goal and McLuckie, Beda and teammate Brendan Burns believed the only way to get there was to open source the technology and build a community around it. As it turns out, they were spot on with that assessment, but couldn’t have been 100 percent certain at the time. Nobody could have.

Photo: Cloud Native Computing Foundation

“If you look at the history, we made the decision to open source Kubernetes and make it a community-oriented project much sooner than conventional wisdom would dictate and focus on really building a community in an open and engaged fashion. And that really paid dividends as Kubernetes has accelerated and effectively become the standard for container orchestration,” McLuckie said.

The next thing they did was to create the Cloud Native Computing Foundation (CNCF) as an umbrella organization for the project. If you think about it, this project could have gone in several directions, as current CNCF director Dan Kohn described in a recent interview.

Going cloud native

Kohn said Kubernetes was unique in a couple of ways. First of all, it was based on existing technology developed over many years at Google. “Even though Kubernetes code was new, the concepts and engineering and know-how behind it was based on 15 years at Google building Borg (And a Borg replacement called Omega that failed),” Kohn said. The other thing was that Kubernetes was designed from the beginning to be open sourced.

Photo: Swapnil Bhartiya on Flickr. Used under CC BY-SA 2.0 license

He pointed out that Google could have gone in a few directions with Kubernetes. It could have created a commercial product and sold it through Google Cloud. It could have open sourced it but kept a strong central lead, as it did with Go. It could have gone to the Linux Foundation and said it wanted to create a stand-alone Kubernetes Foundation. But it did none of these things.

McLuckie says they decided to do something entirely different and place it under the auspices of the Linux Foundation, but not as a Kubernetes project. Instead, they wanted to create a new framework for cloud native computing itself, and the CNCF was born. “The CNCF is a really important staging ground, not just for Kubernetes, but for the technologies that needed to come together to really complete the narrative, to make Kubernetes a much more comprehensive framework,” McLuckie explained.

Getting everyone going in the same direction

Over the last few years, we have watched as Kubernetes has grown into a container orchestration standard. Last summer, in quick succession, a slew of major enterprise players joined the CNCF, as AWS, Oracle, Microsoft, VMware and Pivotal all signed on. They came together with Red Hat, Intel, IBM, Cisco and others who were already members.

Cloud Native Computing Foundation Platinum members

Each of these players no doubt wanted to control the orchestration layer, but they saw Kubernetes gaining momentum so rapidly that they had little choice but to go along. Kohn jokes that having all these big-name players on board is like herding cats, but bringing them in has been the goal all along. He said it just happened much faster than he thought it would.

David Aronchick, who runs the open source Kubeflow Kubernetes machine learning project at Google and was running Kubernetes in the early days, is shocked by how quickly it has grown. “I couldn’t have predicted it would be like this. I joined in January, 2015 and took on project management for Google Kubernetes. I was stunned at the pent up demand for this kind of thing,” he told TechCrunch in a recent interview.

As it has grown, it has become readily apparent that McLuckie was right about building that cloud native framework instead of a stand-alone Kubernetes foundation. Today there are dozens of adjacent projects and the organization is thriving.

Nobody is more blown away by this than McLuckie himself, who says seeing Kubernetes hit these various milestones since its initial release has been amazing for him and his team to watch. “It’s just been a series of these wonderful kind of moments as Kubernetes has gained a head of steam, and it’s been so much fun to see the community really rally around it.”

Jun 4, 2018

The new Gmail will roll out to all users next month

Google today announced that the new version of Gmail will launch into general availability and become available to all G Suite users next month. The exact date remains up in the air, but my guess is that it’ll be sooner rather than later.

The new Gmail offers features like message snoozing, attachment previews, a sidebar for both Google apps like Calendar and third-party services like Trello, offline support, confidential messages that self-destruct after a set time, and more. It’s also the only edition of Gmail that currently allows you to try out Smart Compose, which tries to complete your sentences for you.

Here is what the rollout will look like for G Suite users. Google didn’t detail what the plan for regular users will look like, but if you’re not a G Suite user, you can already try the new Gmail today anyway, and chances are stragglers will be switched over to the new version at a similar pace to G Suite users.

Starting in July, G Suite admins will be able to immediately transition all of their users to the new Gmail, but users can still opt out for another twelve weeks. After that time is up, all G Suite users will move to the new Gmail experience.

Admins can also give users the option to try the new Gmail at their own pace or — and this is the default setting — they can just wait another four weeks and then Google will automatically give users the option to opt in.

Eight weeks after general availability, so sometime in September, all users will be migrated automatically but can still opt out for another four weeks.
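Laid against a concrete calendar, the schedule is easier to follow. A small sketch, assuming a purely hypothetical GA date of July 15 (Google hasn’t committed to a day):

```python
from datetime import date, timedelta

ga = date(2018, 7, 15)  # hypothetical GA date; Google hasn't announced one

milestones = [
    ("GA: admins can transition users; 12-week opt-out window opens", ga),
    ("Default: users automatically offered the option to opt in", ga + timedelta(weeks=4)),
    ("All users migrated, with 4 more weeks of opt-out", ga + timedelta(weeks=8)),
    ("Opt-out ends; everyone is on the new Gmail", ga + timedelta(weeks=12)),
]
for label, when in milestones:
    print(f"{when:%b %d}: {label}")
# Jul 15: GA: admins can transition users; 12-week opt-out window opens
# Aug 12: Default: users automatically offered the option to opt in
# Sep 09: All users migrated, with 4 more weeks of opt-out
# Oct 07: Opt-out ends; everyone is on the new Gmail
```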

That all sounds a bit more complicated than necessary, but the main gist here is: chances are you’ll get access to the new Gmail next month, and if you hate it, you can still opt out for a bit longer. Then, if you still hate it, you’re out of luck, because come October, you will be using the new Gmail no matter what.
