Dec 17, 2018

Google will make it easier for people without accounts to collaborate on G Suite documents

Soon it will be easier for people without Google accounts to collaborate on G Suite documents. Currently in beta, a new feature will enable G Suite users to invite people without G Suite subscriptions or Google accounts to work on files by sending them a PIN code.

Using the PIN code to gain access allows invitees to view, comment on, suggest edits to or directly edit Google Docs, Sheets and Slides files. The owners and admins of the G Suite files can monitor usage through activity logs and revoke access at any time. According to the feature’s support article, admins are able to set permissions by department or domain. They can also restrict sharing outside of whitelisted G Suite domains or their own organization.

In order to sign up for the beta program, companies need to fill in this form and select a non-G Suite domain they plan to collaborate with frequently.

According to a Reuters article published in February, Google has doubled the number of organizations with a G Suite subscription to more than 4 million since intensifying its focus on enterprise customers. But despite Google’s efforts to build its enterprise user base, G Suite hasn’t come close to supplanting Office 365 as the cloud-based productivity software of choice for companies.

Office 365 made $13.8 billion in sales in 2016, versus just $1.3 billion for G Suite, according to Gartner. Google has, however, added features to G Suite to make the two competing software suites more interoperable. One update enables Google Drive users to comment on Office files, PDFs and images in the Drive preview panel without converting them to Google Docs, Sheets or Slides files first, even if they don’t have Microsoft Office or Acrobat Reader. Before that, Google also released a Drive plugin for Outlook.

This may not convince Microsoft customers to switch, especially if they have been using its software for decades, but at least it will get more workers comfortable with Google’s alternatives, and may convince some companies to subscribe to G Suite for at least some employees or departments.

Dec 11, 2018

The Cloud Native Computing Foundation adds etcd to its open-source stable

The Cloud Native Computing Foundation (CNCF), the open-source home of projects like Kubernetes and Vitess, today announced that its technical committee has voted to bring a new project on board. That project is etcd, the distributed key-value store that was first developed by CoreOS (now owned by Red Hat, which in turn will soon be owned by IBM). Red Hat has now contributed this project to the CNCF.

Etcd, which is written in Go, is already a major component of many Kubernetes deployments, where it functions as a source of truth for coordinating clusters and managing the state of the system. Other open-source projects that use etcd include Cloud Foundry, and companies that use it in production include Alibaba, ING, Pinterest, Uber, The New York Times and Nordstrom.
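Etcd’s data model is a simple key-value API with watches, which is what makes it useful as a cluster’s source of truth. The sketch below illustrates that pattern using the community python-etcd3 client rather than the official Go client; the endpoint, key and values are assumptions for illustration.

```python
# Minimal sketch of etcd's key-value model using the third-party
# python-etcd3 client (pip install etcd3). Endpoint, keys and values
# are placeholders, not anything prescribed by etcd itself.
import etcd3

client = etcd3.client(host="127.0.0.1", port=2379)

# Write a piece of desired state, the way an orchestrator might.
client.put("/config/web/replicas", "3")

# Read it back; get() returns (value, metadata).
value, meta = client.get("/config/web/replicas")
print("desired replicas:", value.decode())

# React to changes, similar to how Kubernetes components watch etcd.
events_iterator, cancel = client.watch("/config/web/replicas")
for event in events_iterator:
    print("state changed:", event.value.decode())
    cancel()  # stop watching after the first change in this sketch
```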

“Kubernetes and many other projects like Cloud Foundry depend on etcd for reliable data storage. We’re excited to have etcd join CNCF as an incubation project and look forward to cultivating its community by improving its technical documentation, governance and more,” said Chris Aniszczyk, COO of CNCF, in today’s announcement. “Etcd is a fantastic addition to our community of projects.”

Today, etcd has well over 450 contributors and nine maintainers from eight different companies. The fact that it ended up at the CNCF is only logical, given that the foundation is also the host of Kubernetes. With this, the CNCF now plays host to 17 projects that fall under its “incubated technologies” umbrella. In addition to etcd, these include OpenTracing, Fluentd, Linkerd, gRPC, CoreDNS, containerd, rkt, CNI, Jaeger, Notary, TUF, Vitess, NATS, Helm, Rook and Harbor. Kubernetes, Prometheus and Envoy have already graduated from this incubation stage.

That’s a lot of projects for one foundation to manage, but the CNCF community is also extraordinarily large. This week alone about 8,000 developers are converging on Seattle for KubeCon/CloudNativeCon, the organization’s biggest event yet, to talk all things containers. It surely helps that the CNCF has managed to bring competitors like AWS, Microsoft, Google, IBM and Oracle under a single roof to collaboratively work on building these new technologies. There is a risk of losing focus here, though, something that happened to the OpenStack project when it went through a similar growth and hype phase. It’ll be interesting to see how the CNCF will manage this as it brings on more projects (with Istio, the increasingly popular service mesh, being a likely candidate for coming over to the CNCF as well).

Nov 16, 2018

Former Oracle exec Thomas Kurian to replace Diane Greene as head of Google Cloud

Diane Greene announced in a blog post today that she will be stepping down as CEO of Google Cloud and will help former Oracle executive Thomas Kurian transition into the role when he takes over early next year.

Greene took over the position almost exactly three years ago when Google bought Bebop, the startup she was running. The thinking at the time was that the company needed someone with a strong enterprise background and Greene, who helped launch VMware, certainly had the enterprise credentials they were looking for.

In the blog post announcing the transition, she trumpeted her accomplishments. “The Google Cloud team has accomplished amazing things over the last three years, and I’m proud to have been a part of this transformative work. We have moved Google Cloud from having only two significant customers and a collection of startups to having major Fortune 1000 enterprises betting their future on Google Cloud, something we should accept as a great compliment as well as a huge responsibility,” she wrote.

The company had a disparate set of cloud services when she took over, and one of the first things Greene did was to put them all under a single Google Cloud umbrella. “We’ve built a strong business together — set up by integrating sales, marketing, Google Cloud Platform (GCP), and Google Apps/G Suite into what is now called Google Cloud,” she wrote in the blog post.

As for Kurian, he stepped down as president of product development at Oracle at the end of September. He had announced a leave of absence earlier in the month before making the exit permanent. Like Greene before him, he brings a level of enterprise street cred, which the company needs as it continues to try to grow its cloud business.

After three years with Greene at the helm, Google, which has tried to position itself as the more open cloud alternative to Microsoft and Amazon, has still struggled to gain market share against its competitors, remaining under 10 percent consistently throughout Greene’s tenure.

As Synergy’s John Dinsdale told TechCrunch in an article on Google Cloud’s strategy in 2017, the company had not been particularly strong in the enterprise to that point. “The issues of course are around it being late to market and the perception that Google isn’t strong in the enterprise. Until recently Google never gave the impression (through words or deeds) that cloud services were really important to it. It is now trying to make up for lost ground, but AWS and Microsoft are streets ahead,” Dinsdale explained at the time. Greene was trying hard to change that perception.

Holger Mueller, an analyst at Constellation Research, says Greene was able to shift the focus more toward the enterprise, and he likes what Kurian brings to the table, even if it will take a bit of a cultural shift after his many years at Oracle. “What Greene did not address has been how to tie the product portfolio of Google’s autonomous and disparate development teams together. Kurian is a great fit for that job, having led 35k+ developers at Oracle, ending the trench warfare between product teams and divisions that plagued Oracle a decade ago,” Mueller explained.

Google has not released many revenue numbers related to the cloud, but in February it indicated it was earning a billion dollars a quarter, a number that Greene felt put Google in elite company. Amazon and Microsoft were reporting numbers like that in a month at the time. Google stopped reporting cloud revenue after that report.

Regardless, the company will turn to Kurian to continue growing those numbers now. “I will continue as CEO through January, working with Thomas to ensure a smooth transition. I will remain a Director on the Alphabet board,” Greene wrote in her blog post.

Interestingly enough, Oracle has struggled with its own transition to the cloud. Kurian gets a company that was born in the cloud, rather than one that has made a transition from on-prem software and hardware to one solely in the cloud. It will be up to him to steer Google Cloud moving forward.

Oct 29, 2018

Google beefs up Firebase platform for the enterprise

Today at the Firebase Summit in Prague, Google announced a number of updates to its Firebase app development platform designed to help it shift from an environment for individuals or small teams into a full-blown enterprise development tool.

Google acquired Firebase four years ago to help developers connect to key cloud tools like databases or storage via a set of software development kits (SDKs). Over time, it has layered on more sophisticated functionality, like monitoring to fix performance issues and analytics to see how users are engaging with an app, among other things. But the toolkit hasn’t necessarily been geared toward larger organizations until now.

“[Today’s announcements] are going to be around a set of features and updates that are catered more towards enterprises and sophisticated app teams that are looking to build and grow their mobile apps,” Francis Ma, head of product at Firebase, told TechCrunch.

Perhaps the biggest piece of news is that the company is adding corporate support. While the company boasts 1.5 million apps running on Firebase every month, to move deeper into the enterprise it needed a place corporate IT can call when it runs into issues. That is coming: the company expects to announce various support packages, in beta, by the end of the year. These will be tied to broader Google Cloud Platform support.

“With this launch, if you already have a paid GCP Support package, you will be able to get your Firebase questions answered through the Google Cloud Platform (GCP) Support Console. Once the change is fully launched, Firebase support will be included at no additional charge with paid GCP Support packages, which includes target response times, a dedicated technical account manager (if you are enrolled in Enterprise Support) and more,” Ma explained in a blog post.

In addition, larger teams and organizations need more management tools, and for that the company announced the Firebase Management API, which allows programmatic access to manage project workflows from the IDE to Firebase. Ma says this includes direct integration with StackBlitz and Glitch, two web-based IDEs. “Their platforms will now automatically detect when you are creating a Firebase app and allow you to deploy to Firebase Hosting with the click of a button, without ever leaving their platforms,” Ma wrote.
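The Management API itself is a REST surface. As a rough, hedged sketch of what programmatic access can look like, the snippet below lists the Firebase projects visible to a set of application default credentials; the v1beta1 endpoint, OAuth scope and response field names reflect the public documentation as best understood and should be treated as assumptions.

```python
# Hedged sketch: listing Firebase projects through the Firebase
# Management REST API. Endpoint, scope and response fields are
# assumptions; credentials come from the local environment
# (e.g. GOOGLE_APPLICATION_CREDENTIALS).
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/firebase.readonly"]
)
session = AuthorizedSession(credentials)

resp = session.get("https://firebase.googleapis.com/v1beta1/projects")
resp.raise_for_status()
# "results" is the assumed name of the list field in the response.
for project in resp.json().get("results", []):
    print(project.get("projectId"), project.get("displayName"))
```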

There was a bushel of other announcements, including access to better facial recognition tools in the ML Kit announced last spring, improvements to Crashlytics performance monitoring, which now includes an integration with PagerDuty, and Firebase Predictions, its analytics tool, which is now generally available after graduating from beta.

All of these announcements, and more, are part of a maturation of the Firebase platform as Google aims to move it from a tool aimed directly at developers to one that can be integrated at the enterprise level.

Oct 10, 2018

Google+ for G Suite lives on and gets new features

You thought Google+ was dead, didn’t you? And it is — if you’re a consumer. But the business version of Google’s social network will live on for the foreseeable future — and it’s getting a bunch of new features today.

Google+ for G Suite isn’t all that different from the Google+ for consumers, but its focus is very much on allowing users inside a company to easily share information. Current users include the likes of Nielsen and French retailer Auchan.

The new features that Google is announcing today give admins more tools for managing and reviewing posts, allow employees to tag content and provide better engagement metrics to posters.

Google recently introduced the ability for admins to bulk-add groups of users to a Google+ community, for example. Soon, those admins will also be able to better review and moderate posts made by their employees, and to define custom streams so that employees can, for example, get access to a stream with all of the posts from a company’s leadership team.

But what’s maybe more important in this context is that tags now make it easy for employees to route content to everybody in the company, no matter which group they work in. “Even if you don’t know all employees across an organization, tags makes it easier to route content to the right folks,” the company explains in today’s blog post. “Soon you’ll be able to draft posts and see suggested tags, like #research or #customer-insights when posting customer survey results.”

As far as the new metrics go, there’s nothing all that exciting going on here, but G Suite customers who keep their reporting structure in the service will be able to provide analytics to employees so they can see how their posts are being viewed across the company and which teams engage most with them.

At the end of the day, none of these are revolutionary features. But the timing of today’s announcement surely isn’t a coincidence, given that Google announced the death of the consumer version of Google+ — and the bug that went along with that — only a few days ago. Today’s announcement is clearly meant to be a reminder that Google+ for the enterprise isn’t going away and remains in active development. I don’t think all that many businesses currently use Google+, though, and with Hangouts Chat and other tools, they now have plenty of options for sharing content across groups.

Oct 10, 2018

Google’s Apigee officially launches its API monitoring service

It’s been about two years since Google acquired API management service Apigee. Today, the company is announcing new extensions that make it easier to integrate the service with a number of Google Cloud services, as well as the general availability of the company’s API monitoring solution.

Apigee API monitoring allows operations teams to get more insight into how their APIs are performing. The idea here is to make it easy for these teams to figure out when there’s an issue and what its root cause is by giving them very granular data. “APIs are now part of how a lot of companies are doing business,” Ed Anuff, Apigee’s former SVP of product strategy and now Google’s product and strategy lead for the service, told me. “So that tees up the need for API monitoring.”

Anuff also told me that he believes that it’s still early days for enterprise API adoption — but that also means that Apigee is currently growing fast as enterprise developers now start adopting modern development techniques. “I think we’re actually still pretty early in enterprise adoption of APIs,” he said. “So what we’re seeing is a lot more customers going into full production usage of their APIs. A lot of what we had seen before was people using it for maybe an experiment or something that they were doing with a couple of partners.” He also attributed part of the recent growth to customers launching more mobile applications where APIs obviously form the backbone of much of the logic that drives those apps.

API Monitoring was already available as a beta, but it’s now generally available to all Apigee customers.

Given that it’s now owned by Google, it’s no surprise that Apigee is also launching deeper integrations with Google’s cloud services now — specifically services like BigQuery, Cloud Firestore, Pub/Sub, Cloud Storage and Spanner. Some Apigee customers are already using this to store every message passed through their APIs to create extensive logs, often for compliance reasons. Others use Cloud Firestore to personalize content delivery for their web users or to collect data from their APIs and then send that to BigQuery for analysis.
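As a rough illustration of that pattern, once API call records have landed in a BigQuery table an operations team can analyze them with the standard Python client; the dataset, table and column names below are assumptions for illustration, not part of Apigee’s product.

```python
# Hedged sketch: analyzing API traffic that an Apigee flow has
# streamed into BigQuery. The project/dataset/table and column names
# are assumptions.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT api_proxy, COUNT(*) AS calls, AVG(latency_ms) AS avg_latency_ms
    FROM `my_project.api_logs.proxy_traffic`
    WHERE DATE(request_time) = CURRENT_DATE()
    GROUP BY api_proxy
    ORDER BY calls DESC
"""

for row in client.query(query).result():
    print(row.api_proxy, row.calls, round(row.avg_latency_ms, 1))
```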

Anuff stressed that Apigee remains just as open to third-party integrations as it always was. That is part of the core promise of APIs, after all.

Oct 10, 2018

Google introduces dual-region storage buckets to simplify data redundancy

Google is playing catch-up in the cloud, and as such it wants to provide flexibility to differentiate itself from AWS and Microsoft. Today, the company announced a couple of new options to help separate it from the cloud storage pack.

Storage may seem stodgy, but it’s a primary building block for many cloud applications. Before you can build an application you need the data that will drive it, and that’s where the storage component comes into play.

One of the issues companies face as they move data to the cloud is making sure it stays close to the application to reduce latency when it’s needed. Customers also require redundancy in the event of a catastrophic failure, but they still need low-latency access. The latter has been a hard problem to solve until today, when Google introduced a new dual-region storage option.

As Google described it in the blog post announcing the new feature, “With this new option, you write to a single dual-regional bucket without having to manually copy data between primary and secondary locations. No replication tool is needed to do this and there are no network charges associated with replicating the data, which means less overhead for you storage administrators out there. In the event of a region failure, we transparently handle the failover and ensure continuity for your users and applications accessing data in Cloud Storage.”

This allows companies to have redundancy with low latency, while controlling where their data lives, without having to move it manually should the need arise.
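For a sense of what this looks like in code, here is a minimal, hedged sketch that creates a dual-region bucket with the google-cloud-storage Python client; the bucket name and the NAM4 dual-region code are assumptions, and the location codes available depend on which regions Google pairs together.

```python
# Hedged sketch: creating a dual-region bucket with the Python
# google-cloud-storage client. Bucket name and the "NAM4" dual-region
# code are assumptions for illustration.
from google.cloud import storage

client = storage.Client()
bucket = client.create_bucket("example-dual-region-bucket", location="NAM4")
print(bucket.name, bucket.location)

# Writes go to a single bucket; Cloud Storage replicates across the
# two regions behind the scenes, so application code does not change.
blob = bucket.blob("reports/2018-10-10.csv")
blob.upload_from_string("order_id,total\n1,9.99\n")
```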

Knowing what you’re paying

Companies don’t always require instant access to data, and Google (and other cloud vendors) offer a variety of storage options that make it cheaper to store and retrieve archived data. As of today, Google is offering a clear way to determine costs based on the storage types customers choose. While it might not seem revolutionary to let customers know what they are paying, Dominic Preuss, Google’s director of product management, says it hasn’t always been a simple matter to calculate these kinds of costs in the cloud. Google decided to simplify it by clearly outlining the costs for medium-term (Nearline) and long-term (Coldline) storage across multiple regions.

As Google describes it, “With multi-regional Nearline and Coldline storage, you can access your data with millisecond latency, it’s distributed redundantly across a multi-region (U.S., EU or Asia), and you pay archival prices. This is helpful when you have data that won’t be accessed very often, but still needs to be protected with geographically dispersed copies, like media archives or regulated content. It also simplifies management.”
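To make that combination concrete, the hedged sketch below creates a Nearline bucket in a multi-region with the same Python client; the bucket name is an assumption, and Coldline works the same way with a different storage class.

```python
# Hedged sketch: an archival bucket combining the Nearline storage
# class with a multi-region location, as described above. Names are
# assumptions for illustration.
from google.cloud import storage

client = storage.Client()
bucket = storage.Bucket(client, name="example-media-archive")
bucket.storage_class = "NEARLINE"   # or "COLDLINE" for long-term archives
bucket = client.create_bucket(bucket, location="EU")  # a multi-region

print(bucket.name, bucket.storage_class, bucket.location)
```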

Under the new plan, you can select the type of storage you need and the kind of regional coverage you want, and see exactly what you are paying.

Google Cloud storage pricing options. Chart: Google

Each of these new storage services has been designed to provide additional options for Google Cloud customers, giving them more transparency around pricing and flexibility and control over storage types, regions and the way they deal with redundancy across data stores.

Oct 10, 2018

Google expands its identity management portfolio for businesses and developers

Over the course of the last year, Google has launched a number of services that bring to other companies the same BeyondCorp model it uses internally for managing access to a company’s apps and data without a VPN. Google’s flagship product for this is Cloud Identity, which is essentially Google’s BeyondCorp, but packaged for other businesses.

Today, at its Cloud Next event in London, Google is expanding this portfolio of Cloud Identity services with three new products and features that enable developers to adopt this way of thinking about identity and access in their own apps, and that make it easier for enterprises to adopt Cloud Identity and make it work with their existing solutions.

The highlight of today’s announcements, though, is Cloud Identity for Customers and Partners, which is now in beta. While Cloud Identity is very much meant for employees at a larger company, this new product allows developers to build the same kind of identity and access management services into their own applications.

“Cloud Identity is how we protect our employees and you protect your workforce,” Karthik Lakshminarayanan, Google’s product management director for Cloud Identity, said in a press briefing ahead of the announcement. “But what we’re increasingly finding is that developers are building applications and are also having to deal with identity and access management. So if you’re building an application, you might be thinking about accepting usernames and passwords, or you might be thinking about accepting social media as an authentication mechanism.”

This new service allows developers to build in multiple ways of authenticating the user, including through email and password, Twitter, Facebook, their phones, SAML, OIDC and others. Google then handles all of that authentication work. Google will offer both client-side (web, iOS and Android) and server-side SDKs (with support for Node.js, Java, Python and other languages).
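On the server side, the flow boils down to token verification. The hedged sketch below uses the Firebase Admin SDK for Python, which exposes this kind of verification today; the credentials setup and the id_token variable are assumptions, and the exact SDK surface for the new service may differ.

```python
# Hedged sketch: verifying a user's ID token on the server, the kind
# of work Google takes over with Cloud Identity for Customers and
# Partners. Uses the Firebase Admin SDK for Python; credentials and
# the id_token value are assumptions.
import firebase_admin
from firebase_admin import auth

# Initializes with Application Default Credentials
# (e.g. GOOGLE_APPLICATION_CREDENTIALS pointing to a service account key).
firebase_admin.initialize_app()

def authenticate(id_token: str) -> str:
    """Return the user ID if the token checks out, raise otherwise."""
    decoded = auth.verify_id_token(id_token)
    return decoded["uid"]
```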

“They no longer have to worry about getting hacked and their passwords and their user credentials getting compromised,” added Lakshminarayanan, “They can now leave that to Google and the exact same scale that we have, the security that we have, the reliability that we have — that we are using to protect employees in the cloud — can now be used to protect that developer’s applications.”

In addition to Cloud Identity for Customers and Partners, Google is also launching a new feature for the existing Cloud Identity service that brings support for traditional LDAP-based applications and IT services like VPNs to Cloud Identity. This feature is, in many ways, an acknowledgment that most enterprises can’t simply turn on a new security paradigm like BeyondCorp/Cloud Identity. With support for secure LDAP, these companies can still make it easy for their employees to connect to these legacy applications while also using Cloud Identity.

“As much as Google loves the cloud, a mantra that Google has is ‘let’s meet customers where they are.’ We know that customers are embracing the cloud, but we also know that they have a massive, massive footprint of traditional applications,” Lakshminarayanan explained. He noted that most enterprises today run two solutions: one that provides access to their on-premise applications and another that provides the same services for their cloud applications. Cloud Identity now natively supports access to many of these legacy applications, including Aruba Networks (HPE), Itopia, JAMF, Jenkins (Cloudbees), OpenVPN, Papercut, pfSense (Netgate), Puppet, Sophos and Splunk. Indeed, as Google notes, virtually any application that supports LDAP over SSL can work with this new service.
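As a rough sketch of what “LDAP over SSL” means for one of those legacy applications, the snippet below binds to a directory over LDAPS with the python ldap3 library. The hostname, certificate paths, base DN and query are assumptions; Google’s secure LDAP service authenticates clients with certificates it issues, which this sketch only gestures at.

```python
# Hedged sketch: a legacy app talking to an LDAP directory over SSL
# (LDAPS). Hostname, certificate paths, base DN and filter are
# assumptions; Google's secure LDAP issues the client certificate.
import ssl
from ldap3 import Server, Connection, Tls

tls = Tls(
    local_certificate_file="ldap-client.crt",
    local_private_key_file="ldap-client.key",
    validate=ssl.CERT_REQUIRED,
)
server = Server("ldap.google.com", port=636, use_ssl=True, tls=tls)

with Connection(server, auto_bind=True) as conn:
    conn.search(
        search_base="dc=example,dc=com",
        search_filter="(objectClass=person)",
        attributes=["cn", "mail"],
    )
    for entry in conn.entries:
        print(entry.cn, entry.mail)
```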

Finally, the third new feature Google is launching today is context-aware access for those enterprises that already use its Cloud Identity-Aware Proxy (yes, those names are all a mouthful). The idea here is to help enterprises provide access to cloud resources based on the identity of the user and the context of the request — all without using a VPN. That’s pretty much the promise of BeyondCorp in a nutshell, and this implementation, which is now in beta, allows businesses to manage access based on the user’s identity and a device’s location and its security status, for example. Using this new service, IT managers could restrict access to one of their apps to users in a specific country, for example.

 

Oct 10, 2018

Google Cloud expands its networking feature with Cloud NAT

It’s a busy week for news from Google Cloud, which is hosting its Next event in London. Today, the company used the event to launch a number of new networking features. The marquee launch today is Cloud NAT, a new service that makes it easier for developers to build cloud-based services that don’t have public IP addresses and can only be accessed from applications within a company’s virtual private cloud.

As Google notes, building this kind of setup was already possible, but it wasn’t easy. Obviously, this is a pretty common use case, though, so with Cloud NAT, Google now offers a fully managed service that handles all the network address translation (hence the NAT) and provides access to these private instances behind the Cloud NAT gateway.

Cloud NAT supports Google Compute Engine virtual machines as well as Google Kubernetes Engine containers, and offers both a manual mode where developers can specify their IPs and an automatic mode where IPs are automatically allocated.

Also new in today’s release is Firewall Rules Logging, which is now in beta. Using this feature, admins can audit, verify and analyze the effects of their firewall rules. That means when there are repeated connection attempts that the firewall blocked, you can now analyze those and see whether somebody was up to no good or whether somebody misconfigured the firewall. Because the data is only delayed by about five seconds, the service provides near real-time access to this data — and you can obviously tie this in with other services like Stackdriver Logging, Cloud Pub/Sub and BigQuery to create alerts and further analyze the data.
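As a hedged sketch of that analysis side, the snippet below pulls recent firewall log entries with the Stackdriver Logging Python client; the filter string (the resource type and log name for firewall rules logs) and the payload field names are assumptions.

```python
# Hedged sketch: reading recent firewall rule log entries from
# Stackdriver Logging. The filter and payload field names are
# assumptions about how firewall logs are labeled.
from google.cloud import logging

client = logging.Client()

log_filter = (
    'resource.type="gce_subnetwork" '
    'AND logName:"compute.googleapis.com%2Ffirewall"'
)

for i, entry in enumerate(
    client.list_entries(filter_=log_filter, order_by=logging.DESCENDING)
):
    payload = entry.payload if isinstance(entry.payload, dict) else {}
    print(entry.timestamp, payload.get("disposition"), payload.get("connection"))
    if i >= 19:  # only inspect the 20 most recent entries in this sketch
        break
```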

Also new today are managed TLS certificates for HTTPS load balancers. The idea here is to take the hassle out of managing TLS certificates (the kind of certificates that ensure that your user’s browser creates a secure connection to your app) when there is a load balancer in play. This feature, too, is now in beta.

Sep 27, 2018

Alphabet’s Chronicle launches an enterprise version of VirusTotal

VirusTotal, the virus and malware scanning service owned by Alphabet’s Chronicle, launched an enterprise-grade version of its service today. VirusTotal Enterprise offers significantly faster and more customizable malware search, as well as a new feature called Private Graph, which allows enterprises to create their own private visualizations of their infrastructure and the malware that affects their machines.

The Private Graph makes it easier for enterprises to create an inventory of their internal infrastructure and users to help security teams investigate incidents (and where they started). In the process of building this graph, VirusTotal also looks at commonalities between different nodes to be able to detect changes that could signal potential issues.

The company stresses that these graphs are obviously kept private. That’s worth noting because VirusTotal already offered a similar tool for its premium users — the VirusTotal Graph. All of the information there, however, was public.

As for the faster and more advanced search tools, VirusTotal notes that its service benefits from Alphabet’s massive infrastructure and search expertise. This allows VirusTotal Enterprise to offer a 100x speed increase, as well as better search accuracy. Using the advanced search, the company notes, a security team could now extract the icon from a fake application, for example, and then return all malware samples that share that same icon.
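VirusTotal has long offered a public REST API for lookups, which gives a flavor of the programmatic access the enterprise tier builds on. The hedged sketch below requests a file report from the v2 endpoint; the API key and hash are placeholders, and the enterprise-only search features are not part of this public API.

```python
# Hedged sketch: looking up a file hash against VirusTotal's public
# v2 REST API. API key and hash are placeholders; the enterprise
# search described above goes well beyond this basic lookup.
import requests

API_KEY = "YOUR_VIRUSTOTAL_API_KEY"
SAMPLE_SHA256 = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"

resp = requests.get(
    "https://www.virustotal.com/vtapi/v2/file/report",
    params={"apikey": API_KEY, "resource": SAMPLE_SHA256},
)
resp.raise_for_status()
report = resp.json()
print(report.get("positives"), "of", report.get("total"), "engines flagged this file")
```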

VirusTotal says that it plans to “continue to leverage the power of Google infrastructure” and expand this enterprise service over time.

Google acquired VirusTotal back in 2012. For the longest time, the service didn’t see too many changes, but earlier this year, Google’s parent company Alphabet moved VirusTotal under the Chronicle brand and the development pace seems to have picked up since.
