
Category Archives: Google

Google is shutting down Google One VPN because ‘people simply weren’t using it’ – ZDNet

Posted: April 16, 2024 at 10:46 am

Lance Whitney/ZDNET

Google One subscribers who use the plan's VPN service will have to find another way to secure their internet connections. As spotted by Android Authority, Google sent an email to some Google One subscribers this week, burying an announcement at the end that it will discontinue the plan's VPN feature. Though no specific retirement date was given, expect the VPN to be jettisoned from Google One sometime in the coming months, and certainly before the end of 2024.

When asked why it's retiring the VPN service, a Google spokesperson told ZDNET: "We're refocusing our efforts to support more in-demand features with Google One. To keep our subscription service fresh, we're discontinuing the VPN feature, as we found people simply weren't using it."

Introduced in 2018, Google One is a subscription-based plan that bundles a variety of features across tiers ranging from $1.99 per month to $19.99 per month. The basic levels offer cloud storage, dark web monitoring, Google Photos editing tools, and for now, the VPN. The more expensive tiers include 10% back in the Google Store, Google Workspace premium features, Gemini Advanced, and Gemini in Gmail and Docs.


Added in October 2020, the VPN was initially only available with certain paid plans and only on Android devices. Over time, Google expanded the feature to cover all paid Google One plans and kicked in support for iOS, Windows, and macOS.

Designed to protect multiple devices, the VPN is especially effective if you're using unsecured Wi-Fi networks in public places. In a white paper, Google explains exactly how the VPN works and how it secures your internet connections.

All the above reasons are why many Google One users will likely lament the loss of such a useful security tool. Unfortunately, Google has a history of killing products and services that it feels are no longer worth its time and effort. Thankfully, people who need a new VPN have other options.

Google offers a VPN through its Fi wireless subscription, which spans monthly data plans ranging from $35 to $110. The Google Fi VPN supports both Android devices and iPhones.

Recent Pixel owners can also enjoy a free Google VPN. The Pixel 7, Pixel 7 Pro, Pixel 7a, and Pixel Fold will receive updates in June to give them the same built-in VPN found on Pixel 8 and Pixel 8 Pro phones.

Plus, there are a host of third-party VPN tools available in the Google Play Store. For even more options, check out ZDNET's story on the best mobile VPNs.

Meta and Google announce new in-house AI chips, creating a trillion-dollar question for Nvidia – Fortune

Posted: at 10:46 am

Hardware is emerging as a key AI growth area. For Big Tech companies with the money and talent to do so, developing in-house chips helps reduce dependence on outside designers such as Nvidia and Intel while also allowing firms to tailor their hardware specifically to their own AI models, boosting performance and saving on energy costs.

These in-house AI chips that Google and Meta just announced pose one of the first real challenges to Nvidia's dominant position in the AI hardware market. Nvidia controls more than 90% of the AI chips market, and demand for its industry-leading semiconductors is only increasing. But if Nvidia's biggest customers start making their own chips instead, its soaring share price, up 87% since the start of the year, could suffer.

"From Meta's point of view, it gives them a bargaining tool with Nvidia," Edward Wilford, an analyst at tech consultancy Omdia, told Fortune. "It lets Nvidia know that they're not exclusive, [and] that they have other options. It's hardware optimized for the AI that they are developing."

Why does AI need new chips?

AI models require massive amounts of computing power because of the huge amount of data required to train the large language models behind them. Conventional computer chips simply aren't capable of processing the trillions of data points AI models are built upon, which has spawned a market for AI-specific computer chips, often called cutting-edge chips because they're the most powerful devices on the market.

Semiconductor giant Nvidia has dominated this nascent market: The wait list for Nvidia's $30,000 flagship AI chip is months long, and demand has pushed the firm's share price up almost 90% in the past six months.

And rival chipmaker Intel is fighting to stay competitive. It just released its Gaudi 3 AI chip to compete directly with Nvidia. AI developers, from Google and Microsoft down to small startups, are all competing for scarce AI chips, limited by manufacturing capacity.

Why are tech companies starting to make their own chips?

Both Nvidia and Intel can produce only a limited number of chips because they and the rest of the industry rely on Taiwanese manufacturer TSMC to actually assemble their chip designs. With only one manufacturer solidly in the game, the manufacturing lead time for these cutting-edge chips is multiple months. That's a key factor that led major players in the AI space, such as Google and Meta, to resort to designing their own chips. Alvin Nguyen, a senior analyst at consulting firm Forrester, told Fortune that chips designed by the likes of Google, Meta, and Amazon won't be as powerful as Nvidia's top-of-the-line offerings, but that could benefit the companies in terms of speed. They'll be able to produce them on less specialized assembly lines with shorter wait times, he said.

"If you have something that's 10% less powerful but you can get it now, I'm buying that every day," Nguyen said.

Even if the native AI chips Meta and Google are developing are less powerful than Nvidia's cutting-edge AI chips, they could be better tailored to the company's specific AI platforms. Nguyen said that in-house chips designed for a company's own AI platform could be more efficient and save on costs by eliminating unnecessary functions.

"It's like buying a car. Okay, you need an automatic transmission. But do you need the leather seats, or the heated massage seats?" Nguyen said.

"The benefit for us is that we can build a chip that can handle our specific workloads more efficiently," Melanie Roe, a Meta spokesperson, wrote in an email to Fortune.

Nvidia's top-of-the-line chips sell for about $25,000 apiece. They're extremely powerful tools, and they're designed to be good at a wide range of applications, from training AI chatbots to generating images to developing recommendation algorithms such as the ones on TikTok and Instagram. That means a slightly less powerful but more tailored chip could be a better fit for a company such as Meta, for example, which has invested in AI primarily for its recommendation algorithms, not consumer-facing chatbots.

"The Nvidia GPUs are excellent in AI data centers, but they are general purpose," Brian Colello, equity research lead at Morningstar, told Fortune. "There are likely certain workloads and certain models where a custom chip might be even better."

The trillion-dollar question

Nguyen said that more specialized in-house chips could have added benefits by virtue of their ability to integrate into existing data centers. Nvidia chips consume a lot of power, and they give off a lot of heat and noise, so much so that tech companies may be forced to redesign or move their data centers to integrate soundproofing and liquid cooling. Less powerful native chips, which consume less energy and release less heat, could solve that problem.

AI chips developed by Meta and Google are long-term bets. Nguyen estimated that these chips took roughly a year and a half to develop, and it'll likely be months before they're implemented at a large scale. For the foreseeable future, the entire AI world will continue to depend heavily on Nvidia (and, to a lesser extent, Intel) for its computing hardware needs. Indeed, Mark Zuckerberg recently announced that Meta was on track to own 350,000 Nvidia chips by the end of this year (the company's set to spend around $18 billion on chips by then). But movement away from outsourcing computing power and toward native chip design could loosen Nvidia's chokehold on the market.

"The trillion-dollar question for Nvidia's valuation is the threat of these in-house chips," Colello said. "If these in-house chips significantly reduce the reliance on Nvidia, there's probably downside to Nvidia's stock from here. This development is not surprising, but the execution of it over the next few years is the key valuation question in our mind."

Google’s Pixel 8A leaks in all colors including a bold green – The Verge

Posted: at 10:46 am

We're a month out from Google I/O, where the company will likely announce its new Pixel 8A phone. At this point, there have already been plenty of renders and even some real-world shots. And now, courtesy of Android Headlines, it appears official marketing images of the device have leaked.

If accurate, they show that Google plans to offer the Pixel 8A in four colors: black, porcelain, blue, and a very vibrant green. It's certainly more saturated than the subdued mint that Google added to the Pixel 8 color choices earlier this year.

Like its predecessor, the phone is rumored to have a 6.1-inch display, though it's not yet clear whether it'll top out at 90Hz or 120Hz. The 8A still has thicker bezels than the flagship Pixel 8 and 8 Pro, especially at the bottom. Just like those devices, it'll be powered by Google's Tensor G3 chip.

The rumor mill has indicated that this could be the final A-series Pixel phone, at least for a while. Later this year, Google is expected to release three versions of the Pixel 9: the Pixel 9, Pixel 9 Pro, and Pixel 9 Pro XL. All of them will have different screen sizes (and prices). Toss in an eventual Pixel Fold 2, and there just might not be enough room in the lineup for a 9A. Even from a pricing perspective, there just isn't much room between the 7A and the constantly-on-sale Pixel 8 models.

Google's I/O 2024 keynote is scheduled for May 14th. Compared to the many 8A leaks, we haven't seen quite as much regarding the Pixel Fold 2, so it's feasible that the company could hold that one back until October to bring the whole premium series into alignment with the same Tensor G4 processor.

Milan Design Week 2024: Google and Chromasonic Transform Light Into Sound for Making Sense of Color Exhibition … – Cool Hunting

Posted: at 10:46 am

An ethereal immersion into the power and interconnectedness of our senses

Unless you've been following Salone del Mobile for the last few years, Google might not be the first organization to come to mind when you hear the words Milan Design Week. And yet, for attendees, year after year, the tech pioneer has continued to explore the world of design through our interconnected senses, and our abundance of feelings, with exhibitions like A Space for Being with Suchi Reddy and Shaped by Water with Lachlan Turczan. This year, at Garage 21 from 15-21 April, Google's Vice President of Hardware Design, Ivy Ross, will present Making Sense of Color in collaboration with arts and research lab Chromasonic. The immersive installation, which translates light into sound, is as spectacular as it is serene.

In Making Sense of Color, translucent scrims form three rows of seven partially enclosed spaces. All 21 of these nodes are enhanced with a dedicated color-shifting light source and enveloping spatialized audio. With Chromasonic's refrequencing technology, the frequencies emanating from a soothing lightscape are translated into sound while guests pass from space to space, engaging with others or finding their own peace in the waves of color.

We create a condition where you can see sound or hear light

"At a really fundamental level, we connect light frequencies and sound frequencies," multimedia artist Johannes Girardoni, Chromasonic's cofounder, explains to COOL HUNTING on site. "We map light waves to sound waves to help us all expand our perception. We use technology to do that. We create a condition where you can see sound or hear light because we are aligning all the frequencies and waves algorithmically." In person, an immediate calm descends as one steps into the experiential space.
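
Chromasonic hasn't published its algorithm, but the core idea Girardoni describes, assigning each light frequency a corresponding audio frequency, can be sketched in a few lines of Python. The hue-to-pitch mapping below is an assumption for illustration (two octaves starting at 220 Hz), not the installation's actual mapping:

```python
import numpy as np
import wave

SAMPLE_RATE = 44100

def hue_to_pitch(hue_degrees: float) -> float:
    """Map a hue angle (0-360) onto two octaves above 220 Hz.

    Purely illustrative; Chromasonic's real light-to-sound
    mapping is not public.
    """
    return 220.0 * 2 ** (2.0 * (hue_degrees % 360.0) / 360.0)

def tone(freq_hz: float, seconds: float = 1.0) -> np.ndarray:
    t = np.linspace(0.0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
    return 0.3 * np.sin(2.0 * np.pi * freq_hz * t)

# "Play" a drift from red (0 deg) through green (120) to blue (240).
samples = np.concatenate([tone(hue_to_pitch(h)) for h in (0, 120, 240)])

with wave.open("lightscape.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)  # 16-bit PCM
    f.setframerate(SAMPLE_RATE)
    f.writeframes((samples * 32767).astype(np.int16).tobytes())
```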

In addition to being a sensory immersion, it's also a tale of human connection, with pleasant encounters shared in the space. "It's a use of technology that allows us to move our presence into ourselves, to move into the presence of others, and to connect through community," Girardoni continues. "The physical spatial expression of this installation creates an elasticization of space through the movement of light and sound so space may appear to expand and contract. When you're in the space with other participants, they may appear or disappear. You notice yourself. You notice others."

Ross explains that Chromasonic's work at the intersection of art and science is what appealed to Google. "They embrace both physical and virtual sensory technologies to create visceral experiences that really resonate with us and relate to how we approach design," she tells us. In the case of Making Sense of Color, their portion of the exhibit embodies the answer to the question, "what does color sound like?"

Color gives life a pulse

"At the Google Hardware Design Studio we are always considering the sensorial nature of what we design, color being an important aspect," Ross says. "Each color transmits a different vibration. That vibration has a biological and psychological effect on us. Color gives life a pulse. Color resonates with vibrancy, embodying energy, evoking emotion. Right now we are going through a lot of emotion as a society, so understanding the power of color and its different properties feels relevant."

As for the importance of Milan Design Week, Ross adds that "we believe that Salone is the best showcase for design, attracting folks from all over the world. It is the best place to share with the world the thought leadership of the Google Hardware Design group. Our Making Sense of Color experience culminates in a feast for the eyes that shows how color comes to life through the design of Google's hardware portfolio that will be on display." For those in Milan for Salone del Mobile, it's a can't-miss, and for those who are mesmerized from afar, the Chromasonic portion of the installation will travel to other destinations in the future.

I tested the Google Pixel’s Long Exposure photo mode and it’s another reason to leave my pro mirrorless camera at … – TechRadar

Posted: at 10:45 am

Google's Long Exposure photo mode is actually decent. There, I said it. Photographer me is putting his neck on the line by saying that another smartphone computational photography mode, recently given its own tab in Google's revamped Camera app, is one less reason to use a 'proper' camera, and mine's a TechRadar-approved best mirrorless camera, no less.

I was on a short family break at the coast recently and set an early alarm to sneak out for a little solo time at first light at a secluded cove nearby. It would be me, the gentle lapping waves, and hopefully a little color in the sky. Of course, I would take a camera too.

Hot tea in a travel flask, banana, notepad and pen, mirrorless camera, two pro lenses covering the 24-200mm focal length between them, an ND filter plus a tripod, and I was good to go. Oh, and the Google Pixel 6 was in my pocket.

A steep descent through a wooded area and the sheltered east-facing cove came into view. I've learned the importance of enjoying nature first before taking a camera out of the bag, especially given my screen-intensive day job.

After grounding myself in the peace and unrushed pace of the quiet sunrise I started moving around the beach looking for compositions that caught my eye, for photos that would transport me back to what it was like being there.

Sunrise was lovely; not award-winning, but adding a splash of color. The outgoing tide was steadily revealing more of the beach. Small waves crashed against the clay-red sandy incline, climbed up the beach a little, and then retreated around small rocks, creating interesting patterns.

I've taken a few long exposure seascape photos down the years, and love the technique, especially for accentuating the movement of water as it retreats around rocks. I take a quick snap of the scene on the Pixel 6 and it occurs to me that I've not properly used its Long Exposure photo mode yet, now prominent in the camera app with its own tab.

The Long Exposure photo mode blurs movement, while keeping still objects sharp. The creative technique can be used in several ways, with blurring moving water a popular choice. Having observed the water trails, I line up the picture and take the snap.

It works a little like Night Sight: you need to keep your phone as steady as possible while the long exposure is captured. That way the still objects, in this case the rocks, cliff faces, and untouched sand, remain sharp. This computational photography mode is like a pro mirrorless camera's in-body image stabilization on steroids.
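
Google doesn't spell out the pipeline here, but computational long exposure is typically built by aligning a burst of short frames and averaging them: static pixels hold the same value in every frame and stay sharp, while anything that moves smears. A minimal sketch of the averaging step (frame alignment, the stabilization part, is assumed to have already happened):

```python
import numpy as np

def simulated_long_exposure(frames: list[np.ndarray]) -> np.ndarray:
    """Average a burst of already-aligned frames to fake a long exposure."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0).astype(frames[0].dtype)

# Toy demo: a static 4x4 "scene" with one flickering (moving) pixel.
rng = np.random.default_rng(0)
base = rng.integers(0, 255, (4, 4), dtype=np.uint8)
frames = [base.copy() for _ in range(8)]
for i, f in enumerate(frames):
    f[0, 0] = 255 if i % 2 else 0  # the "moving water" pixel
print(simulated_long_exposure(frames)[0, 0])  # blurs to ~127; the rest stays sharp
```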

The phone stores both the regular photo and the long exposure effect image (I've included both versions of every image for comparison). I have to say, the effect in this scenario is convincing (see above), similar to what I'd expect from my mirrorless camera which remains in the bag 50 meters away up the beach.

Whatever camera you use for long exposure photography (be it mirrorless or a cameraphone), in this context of accentuating retreating ocean waters, you need to keep trying and trying and trying to get the shot. Timing is so hard.

Your best bet is starting the capture with the wave at its peak up the beach and just as the water starts to retreat. That way the natural path back to the ocean, be it straight or snaking around rocks, is accentuated and depicts the tidal energy.

Google Pixel's Long Exposure mode isn't perfect; detail is usually softer than in the standard version, but it's pretty darn good and convincing enough that I didn't really need to bring my mirrorless camera, tripod, and ND filters along for the ride. If I owned the OM System OM-1 II (or OM-1), I could use that camera's Live ND computational photography mode instead and leave the tripod and ND filters behind.

I haven't lost faith in my 'proper' camera, far from it. Towards the end of my time at the beach, while still alone, a playful seal popped its head up like a floating rock. I steamed back up the beach to my bag, grabbed the camera with a 70-200mm lens, and got a few photos that far exceed what I could possibly hope to get with the Pixel 6, though some of today's best cameraphones might have done a decent job.

I'll still use my 'proper' camera with tripod and ND filters for long exposure photography, too. It's just that now I might think twice about whether lugging all of that gear to get the creative effect is worth it when I have the computational mode in a device that slips into my pocket.

Google Wallet ‘verify it’s you’ request appears minutes after unlock – 9to5Google

Posted: at 10:45 am

Users have noticed in recent weeks that Google is requiring device unlocks for every tap-to-pay transaction regardless of the amount. At the same time, Google Wallet appears to be testing a second change related to more frequent verification.

Officially, Google says your credit and debit card won't be charged for retail payments unless you've recently used a verification method, like your fingerprint or PIN.

While the support documents don't specify the exact duration of "recently," people, myself included, have noticed it get shorter.

To give an example: When I'm in line at the supermarket, I unlock and use my phone until I get to the register. Usually, the phone is still active (screen on) when it's time to pay. This week, when I tapped the terminal, my Pixel 8 asked me to authenticate again. In the past, I've never had to verify during this specific tap-to-pay situation.

Since then, I've found that three minutes after initially unlocking via fingerprint, Google Wallet wants me to re-authenticate.

In fact, in testing with a timer, I've noticed a new "For your security, you need to verify it's you before paying" prompt at the top of the Google Wallet app. This is only appearing on that Pixel 8 where I encountered the change.

The message did not appear on two other Pixel phones that I had side-by-side during the at-home tests I've performed: unlock the phone via fingerprint, keep the screen active, and open/close Google Wallet at 1-, 2-, and 3-minute intervals until the "verify it's you" prompt appears.
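
Google hasn't published the actual duration or logic, but the behavior these tests imply is a simple sliding verification window. Here's a toy model of that rule, with the three-minute figure taken from the observations above as an assumption:

```python
import time

VERIFICATION_WINDOW_SECONDS = 3 * 60  # assumed from the ~3-minute observation

class PaymentSession:
    """Toy model of the 'recently verified' rule described above."""

    def __init__(self) -> None:
        self._last_verified: float | None = None

    def verify(self) -> None:
        """Stands in for a fingerprint or PIN unlock."""
        self._last_verified = time.monotonic()

    def can_tap_to_pay(self) -> bool:
        """True only if a verification happened within the window."""
        if self._last_verified is None:
            return False
        return time.monotonic() - self._last_verified < VERIFICATION_WINDOW_SECONDS
```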

This suggests Google is either still testing this behavior or has yet to widely roll it out.

For comparison, Apple Pay on the iPhone requires that you authenticate every tap-to-pay transaction. Android and Google Wallet are moving closer in that direction, but still provide more leeway.

The Google Pixel 8a leaks twice, hinting at its design, and four color options – TechRadar

Posted: at 10:45 am

Nothing is official yet, but if we had to make an educated guess, we'd say the Google Pixel 8a is going to be unveiled on the first day of Google I/O 2024, which is May 14. Now two new leaks have given us more of an idea about what to expect from the handset.

To begin with we've got leaked renders of the Pixel 8a courtesy of Android Headlines. There are four colors on show here, apparently called Mint, Porcelain, Obsidian, and Bay (or light green, pale gray, dark gray, and light blue, as they're otherwise known).

These colors are similar to the ones we saw for the Google Pixel 7a, though Mint appears to have replaced Coral (orange). Mint is an option on the Pixel 8 and the Pixel 8 Pro, though here it looks a lot more garish, which might just be due to the way the image is edited.

As Android Headlines points out, we also got a paler Mint color with the Google Pixel 6a in 2022, so this wouldn't be a first for the mid-range series. We noticed that the Pixel 6a was recently removed from sale on the Google Store, leaving space for the Pixel 8a.

The renders we can see here back up previous leaks: the design is similar to the Pixel 8 and indeed the Pixel 7a. It's possible that the corners are going to be slightly more curved, but there's not a lot in it, and this is a phone that still looks very much like a Pixel.

Google may have already revealed the Pixel 8a design in an advert for Google Fi Wireless, and the picture in that ad does match the renders from Android Headlines. The colors seem plausible too, provided that green gets toned down a bit.

Elsewhere in Pixel 8a leak news, serial tipster Evan Blass has spotted that some Pixel 8a tutorials have gone live on the website of a US carrier, which is not ideal from Google's perspective. How long they remain up remains to be seen.

Another potential upgrade we've heard about is a bump to a 120Hz screen, though the Tensor G3 chip may be underclocked to keep the phone below the Pixel 8 in terms of performance. In around a month's time, all should be revealed.

Google Vids is Google’s fourth big productivity app for Workspace – Ars Technica

Posted: at 10:45 am

Image gallery (all images: Google):

- Is that Google Slides? Nope, it's Google Vids, the new video editor that seems to just make souped-up slideshows.
- Google's demo starts with an existing slideshow and then generates an outline.
- Choose a theme; they all look like PowerPoints.
- Write a script, preferably with the help of Google Gemini.
- You can record a voiceover, or pick from Google's robot voices.
- This is a Google Workspace app, so there are lots of real-time collaboration features, like these live mouse cursors that were brought over from Slides.
- Comments work too.
- It's interesting that you get a "stock media" library while apps like Slides would use generative AI images here.
- Record a talk from your webcam.
- Embed your video in the slideshow.

If you had asked me before what Google's video editor app was, I would say "YouTube Studio," but now Google Workspace has a new productivity app called "Google Vids." Normally a video editor is considered a secondary application in many productivity suites, but Google apparently imagines Vids as a major pillar of Workspace, saying Vids is an "all-in-one video creation app for work that will sit alongside Docs, Sheets and Slides." So, that is an editor for documents, spreadsheets, presentations, and videos?

Google's demo of the new video editor pitches the product not for YouTube videos or films but more as a corporate super slideshow for things like training materials or product demos. Really, this "video editor" almost looks like it could completely replace Google Slides since the interface is just Slides but with a video timeline instead of a slideshow timeline.

Google's example video creates a "sales training video" that starts with a Slides presentation as the basic outline. You start with an outline editor, where each slideshow page gets its own major section. Google then has video "styles" you can pick from, which all seem very PowerPoint-y with a big title, subheading, and a slot for some kind of video. Google then wants you to write a script and either read it yourself or have a text-to-speech voice read the script. A "stock media" library lets you fill in some of those video slots with generic corporate imagery like a video of a sunset, choose background music, and use a few pictures. You can also fire up your webcam and record something, sort of like a pre-canned Zoom meeting. After that it's a lot of the usual Google productivity app features: real-time editing collaboration with visible mouse cursors from each participant and a stream of comments.

Like all Google products after the rise of OpenAI, Google pitches Vids as an "AI-powered" video editor, even though there didn't seem to be many generative AI features in the presentation. The videos, images, and music were "stock" media, not AI-generated inventions (Slides can generate images, but that wasn't in this demo). There's nothing in here like OpenAI's "Sora," which generates new videos out of its training data. There's probably a Gemini-powered "help me write" feature for the script, and Google describes the initial outline as "generated" from your starting Slides presentation, but that seemed to be it.

Google says Vids is being released to "Workspace Labs" in June, so you'll be able to opt in to testing it.

Google working to prevent accidental Circle to Search activations – 9to5Google

Posted: at 10:45 am

The latest Made by Google Podcast episode talks to the development team behind Circle to Search (CtS).

The team "spent a ton of time thinking about what is the fastest way to access, because we knew that could make or break the product." It was important to them that you could access it anywhere across the OS. They of course landed on long-pressing the gesture bar (referred to as the "home handle" today) or home button to activate Circle to Search.

With 3-button navigation, CtS replaces Assistant activation, so users have to launch Assistant via the app shortcut, power button, or hotword. Some who use gesture navigation, with its wide but short touch target, complain about accidentally triggering Circle to Search. Fortunately, Google is aware of this:

"We still have further to go, and we're working a lot on making sure it's triggered when you want, [and] it's not triggered when you don't want."

Also of note is explicit acknowledgment that Circle to Search actually works with Lens.

"A lot of the technology is in fact Lens, the visual searching capabilities, OCR on the screen. The key differences are really that it's universally available."

The animation that sweeps across your screen is called the "shimmer," while development on CtS started around January of 2023. Google acknowledges that tapping to select (like in Lens) is faster than circling, but that the team found the circle to be "delightful." Circle to Search also accepts highlighting (or crossing out text) and squiggling over. The team is finding that copying text is an increasingly popular use case.

"And so when you're doing a circle, we have very, very finely tuned the region selection over lots and lots of testing so that it's extremely accurate."
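
Google hasn't detailed its selection algorithm, but the baseline way to make circles, squiggles, highlights, and cross-outs all resolve to one screen region is to reduce the gesture's touch points to a padded bounding box. A minimal sketch of that baseline (the "very finely tuned" production version would go well beyond it):

```python
import math

def gesture_to_region(points: list[tuple[float, float]],
                      pad: float = 8.0) -> tuple[float, float, float, float]:
    """Reduce a circle/squiggle/highlight gesture to (left, top, right, bottom).

    A padded bounding box over the touch points handles circles,
    cross-outs, and scribbles alike; real region selection is far
    more finely tuned, as the team notes.
    """
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (min(xs) - pad, min(ys) - pad, max(xs) + pad, max(ys) + pad)

# A rough circle of touch points around something near (200, 400):
circle = [(200 + 60 * math.cos(a / 10), 400 + 60 * math.sin(a / 10))
          for a in range(63)]
print(gesture_to_region(circle))  # approx. (132, 332, 268, 468)
```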

Besides launching in-line translation, Google has a lot more planned for Circle to Search: "we have lots of ideas about what to do." One thing that was suggested is a merging of the Search result page with the Lens result page, and "that's going to be rolling out over many, many months."

Google Unleashes ‘New Era Of Productivity’ With AI Agents: Partners – CRN

Posted: at 10:45 am

Google Cloud partners break down the benefits of Google's new GenAI agent technology and Vertex AI Agent Builder for the channel.

Google Cloud unveiled a slew of new AI products this week at Google Cloud Next, from new AI agents to new Tensor Processing Unit hardware, to accelerate the artificial intelligence revolution.

Google partners, in particular, were bullish about Google's new Vertex AI Agent Builder and agent technologies that will help them drive customer productivity levels to new heights.

"Google's agents are the manifestation of real value in remedial workflows. So we can help customers go from an agent being partially helpful, like in a traditional chatbot scenario that we think about, to actually completing a complex task," said Tony Safoian, CEO of Google Cloud Premier Partner SADA, an Insight company.

Partners of the Mountain View, Calif.-based $37 billion cloud computing and AI giant can create custom tasks and capabilities for agents in Vertex AI Agent Builder, a new no-code offering for deploying generative AI assistants that are rooted in Google's Gemini family of models.

"For example, if someone is asking about coverage for a complex health-care [plan], it used to be that the agent can give you part of the answer. Now there's no reason these new powerful agents can not only give you the complete answer, but also maybe change your health plan in the background to meet your specific health needs that are now different than they were last year," said SADA's CEO.

[Related: Google Cloud CEO's 5 Bold Remarks On New Chip, AI Agents And The New Way To Cloud]

"So things that you'd normally partially surface to a human for the human to complete, right now we're entering the era of automation and trust in these AI capabilities to complete the task of actually maybe changing your membership, changing your subscription, changing your deductible, etc. to complete the circle of the workflow," said Safoian. "Another easy example is you lost your password. It used to tell you, 'Here's the steps to follow.' Now, it could just do those steps and reset the password for you and text you the password after. This kind of workflow completion is going to unleash a new era of productivity."

To help partners create these AI agents for customers, Google unveiled its new Vertex AI Agent Builder this week.

The new Vertex AI Agent Builder brings together foundation models, Google Search and other developer tools for partners to build and deploy agents, alongside orchestration and augmentation capabilities. With Vertex AI Agent Builder, Google said partners can now quickly create a range of generative AI agents grounded with Google Search and customer data.

Quantiphi, a Google Cloud Premier Partner and AI superstar, said Vertex AI Agent Builder will help customers solve business problems and achieve their AI goals. Quantiphi CEO Asif Hasan said Google agents are the next evolution of AI assistants.

"The assistant way of doing things is you have a search bar and you can ask any questions, and the system will go fetch the answer and give it to you so you don't have to browse links and all that type of stuff. That's like the assistant paradigm," he said.

"The agent paradigm is where, instead of asking a question, you're giving it a goal. Then it's breaking down the goal: it's reasoning, it's planning, it's generating the first draft of the answer, it is reflecting on the answer, and it's making it better," said Hasan. "And then through an iterative set of steps, it's either reflecting on its own or it's asking you for feedback. Then at the end of the process, it's going to help you accomplish your goal. We're super excited about it."
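
That goal-driven loop, plan, draft, reflect, revise, is easy to see in miniature. The sketch below illustrates the paradigm Hasan describes in plain Python around a hypothetical text-in/text-out `llm` callable (standing in for, say, a Gemini endpoint); it is not Vertex AI Agent Builder's actual API:

```python
from typing import Callable

def run_agent(goal: str, llm: Callable[[str], str], max_rounds: int = 3) -> str:
    """Minimal goal -> plan -> draft -> reflect -> revise loop.

    `llm` is a hypothetical text-in/text-out model call; this sketches
    the agent paradigm described above, not a real Google API.
    """
    plan = llm(f"Break this goal into numbered steps: {goal}")
    draft = llm(f"Goal: {goal}\nPlan:\n{plan}\nProduce a first draft.")
    for _ in range(max_rounds):
        critique = llm(f"Goal: {goal}\nDraft:\n{draft}\n"
                       "List concrete flaws, or reply DONE if there are none.")
        if critique.strip() == "DONE":
            break  # the agent is satisfied with its own work
        draft = llm(f"Goal: {goal}\nDraft:\n{draft}\n"
                    f"Critique:\n{critique}\nRevise the draft.")
    return draft
```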

Google also launched new enhancements this week to help organizations build data agents, including injecting Gemini into Google's popular products such as BigQuery, Databases and Looker.

Another area that excited partners at Google Cloud Next was Google's push to create microprocessor hardware that will power Google's and Google customers' AI ambitions.

Google this week unveiled Google Axion, its first-ever custom ARM-based CPU chip designed for the data center, delivering up to 50 percent better performance than comparable current-generation x86-based instances, Google said.

In addition, Google's new AI accelerator, Tensor Processing Unit v5p, is now generally available. Each TPU v5p pod is composed of nearly 9,000 chips over Google's highest-bandwidth inter-chip interconnect (ICI).

"The custom hardware piece of AI is becoming very interesting," said Quantiphi's CEO. "What we've seen with the recent events in the market is there is an insatiable demand for custom AI hardware."

The combination of CPUs and GPUs, or TPUs, is what organizations will leverage in many AI workflows.

"The workhorse in AI is an accelerated compute chip, which is either a GPU or Google's variant, which is even a more specialized Tensor Processing Unit," said Hasan. "Google's [innovation engine] here is amazing."

SADA's CEO echoed Hasan in believing that Google Cloud's end-to-end AI portfolio is a market differentiator.

"So when you look at what Google is doing, you're really talking about the entire spectrum of value, from the core chips required for any company to building those types of things on Google, and packaging them around Workspace and other SaaS technologies," said Safoian.

"The challenge is and will continue to be the collective partner ecosystem helping our customers derive real value out of these building blocks in a way that's measurable, that's credible, that's scalable, that's repeatable, that's secure, that's safe and easy to operate," Safoian said. "So there's a tremendous amount of work for the partner ecosystem to do."

Google Cloud Next in Las Vegas runs from April 9 to April 11.
