

All about Day 1 of Google I/O 2019: Android Q, Pixel 3a, Project Euphonia, Google Go, and more


Day one of the Google I/O 2019 developer conference saw impressive leaps in Artificial Intelligence and Augmented Reality, the launch of the Pixel 3a, and more.

The Google I/O conference began at 10 AM PT and went on for two hours. The event opened with a keynote from the company’s CEO, Sundar Pichai.

I watched the event live, tweeted live updates and I won’t lie, there were moments when I couldn’t help but say ‘Woah!’. You will soon know why.

A lot of information was shared during these two hours, so let me break it down into smaller parts and make it less overwhelming.

Summarizing Day 1 of Google I/O 2019

Android Q and Pixel 3a

The key focus areas of Android Q are security, privacy and digital wellbeing. There are 2.5 billion active Android users in the world.

As we can see around us, foldable phones are just around the corner, and Android Q is equipped to take full advantage of them.

For example, if you’re playing a game on your folded phone and then you open it up, the game will automatically expand onto the bigger screen.
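
From a developer’s point of view, that kind of screen continuity depends on the app handling the fold-to-unfold size change instead of restarting. A minimal Kotlin sketch, assuming a game activity that opts into handling configuration changes itself (the activity and relayout helper names are hypothetical):

```kotlin
// Assumes the manifest declares:
//   android:resizeableActivity="true"
//   android:configChanges="screenSize|smallestScreenSize|screenLayout"
// so the activity is not recreated when the phone is unfolded.
import android.content.res.Configuration
import androidx.appcompat.app.AppCompatActivity

class GameActivity : AppCompatActivity() {

    override fun onConfigurationChanged(newConfig: Configuration) {
        super.onConfigurationChanged(newConfig)
        // The unfolded display reports larger width/height in dp;
        // re-lay-out the game surface to fill the bigger screen.
        relayoutGameSurface(newConfig.screenWidthDp, newConfig.screenHeightDp)
    }

    private fun relayoutGameSurface(widthDp: Int, heightDp: Int) {
        // Hypothetical helper: resize the rendering surface to the new bounds.
    }
}
```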

Android Q will also bring the much-awaited Dark theme, which helps save battery as well.
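
For developers, opting an app into this system-wide Dark theme is mostly a matter of DayNight theming. A minimal Kotlin sketch using AndroidX AppCompat (the Application class name here is made up):

```kotlin
import android.app.Application
import androidx.appcompat.app.AppCompatDelegate

class MyApp : Application() {
    override fun onCreate() {
        super.onCreate()
        // Follow the system Dark theme toggle introduced in Android Q;
        // requires the app to use a Theme.AppCompat.DayNight (or Material DayNight) theme.
        AppCompatDelegate.setDefaultNightMode(
            AppCompatDelegate.MODE_NIGHT_FOLLOW_SYSTEM
        )
    }
}
```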

The Pixel 3a is the most affordable phone in the Pixel lineup so far, with prices starting at $399.

It will be available in two colors, ‘Clearly White’ and ‘Just Black’.

According to the render leaks, the design of the new Pixel smartphones has chunky bezels, a plastic body, and a 3.5mm headphone jack.


Major improvements in Google Search Results

Google Search is all set to reach another level. Podcasts will be introduced to the search results and you will be able to listen to them with just a single tap right from the results window.

Also, thanks to AR, you can now visualize what you search for by seeing it come alive in 3D. For example, if you search for a shark, you’ll be shown a 3D shark in the results.

Want it to be more exciting? You can add this shark to your surroundings and watch it come alive virtually.

ARe you seeing this?! With new AR features in Search rolling out later this month, you can place, view and interact with 3D objects right in your own space. #io19 pic.twitter.com/Q61U0r2Hvg

– Google (@Google) May 7, 2019


If you’re not familiar with the Google Lens app, here is a quick introduction: using the app, you can point your camera at any object to identify it, find it online or learn more details about it.

All of this happens through a deep-learning-based recognition process.

On day 1 of Google I/O 2019, Google shared an update that makes Lens even smarter.

When you go out to eat at a restaurant, point your camera at the menu and Google Lens will highlight the top-rated dishes for you. If you tap on a dish, it will show you its pictures and reviews shared by customers.

That’s not all! It can even split your bill between any number of people and calculate the tip for you just by scanning the bill.
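
The arithmetic Lens is doing there is simple enough to sketch. A hypothetical Kotlin helper, purely for illustration (Lens reads the total off the bill for you; here it is passed in by hand):

```kotlin
// Hypothetical helper illustrating the tip-and-split arithmetic Lens performs.
fun splitBill(total: Double, tipPercent: Double, people: Int): Pair<Double, Double> {
    require(people > 0) { "Need at least one person to split with" }
    val tip = total * tipPercent / 100       // tip as a percentage of the bill
    val perPerson = (total + tip) / people   // equal share of bill plus tip
    return tip to perPerson
}

fun main() {
    val (tip, perPerson) = splitBill(total = 84.50, tipPercent = 18.0, people = 4)
    println("Tip: %.2f, each pays: %.2f".format(tip, perPerson))
}
```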

While reading a recipe from a book placed on your table, you can point the Google Lens camera at it to see the recipe in action in the form of a video.

Google Go for Entry-Level Phones

Google Go is an app specifically designed for entry-level smartphones.

It was revealed at Google I/O that the app can understand and translate over 12 languages.

The best part? Google was serious when it said it is targeting entry-level phones.

Keeping in mind that such phones don’t usually come with much internal storage, the Google Go app weighs in at just a little over 100KB.

Launching first in Google Go, our Search app for first-time smartphone users, we’re working to help people who struggle with reading by giving Google Lens the ability to read text out loud. #io19 pic.twitter.com/YMVbZa5XMQ

– Google (@Google) May 7, 2019

Extending Google Duplex to the web

What is Google Duplex?

Google Duplex is an artificial intelligence (AI) chat agent that can carry out specific verbal tasks, such as making a reservation or an appointment, over the phone.

It is built on a recurrent neural network (RNN) using TensorFlow Extended, a general purpose machine learning platform used at Google.
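
Duplex’s actual model is not public, but to give a rough idea of what ‘recurrent’ means here, below is a toy sketch of a single RNN step in plain Kotlin. The weights and sizes are placeholders, and this illustrates the general technique only, not Duplex or TensorFlow Extended:

```kotlin
import kotlin.math.tanh

// Toy RNN cell: h_t = tanh(Wx * x_t + Wh * h_{t-1} + b).
// Real systems use large, trained networks; the weights here are
// placeholders purely to show the recurrence.
class TinyRnnCell(
    private val wx: Array<DoubleArray>,  // input-to-hidden weights
    private val wh: Array<DoubleArray>,  // hidden-to-hidden weights
    private val b: DoubleArray           // bias
) {
    fun step(x: DoubleArray, hPrev: DoubleArray): DoubleArray {
        val h = DoubleArray(b.size)
        for (i in h.indices) {
            var sum = b[i]
            for (j in x.indices) sum += wx[i][j] * x[j]
            for (j in hPrev.indices) sum += wh[i][j] * hPrev[j]
            h[i] = tanh(sum)
        }
        return h
    }
}
```

Each step feeds the previous hidden state back in, which is what lets a model of this kind keep track of context across a whole phone conversation.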

How does Duplex coming to the web help you?

You can make reservations at your favorite restaurant without having to go through a long process of filling in forms.

Just tell the AI chat agent where and when you want to book seats and it will fill in the forms for you. You just have to confirm the entered details with a single tap.

This next-generation Assistant will come to Pixel phones later in 2019. It can operate, multitask and compose messages even offline.

The new Assistant can perform up to 10 times faster and take back-to-back commands without your having to say ‘Hey Google’ before each one.

Google claims to have reached a milestone by shrinking the speech models behind this from 100GB down to just 0.5GB, small enough to fit right in your pocket.

We’re extending Duplex to the web, previewing how the Google Assistant can help you complete a task online, like renting a car or buying movie tickets. More later this year. #io19 pic.twitter.com/lSWH5Rz72X

– Google (@Google) May 7, 2019

Driving Mode on Google Assistant

The all-new driving mode in Google Assistant will make communicating with your Assistant hands-free.

While in driving mode, the Assistant will suggest destinations based on events marked in your calendar and show the fastest route to them.

And if you receive a call while you’re driving, it will simply ask whether you want to take the call or decline it. You won’t have to move your hand away from the steering wheel to pick up calls anymore.

#HeyGoogle, let’s drive: Coming this summer on @android phones, the Google Assistant’s new driving mode features a thoughtfully designed dashboard with personalized suggestions for navigation, messaging, calling and media. #io19 pic.twitter.com/epJoJdqhX5

– Google (@Google) May 7, 2019

Incognito Mode in Maps

Your location history is both vital and private to you. With the goal of improving the privacy and security of its users, Google Maps now gets an Incognito mode.

While in Incognito mode, your location history won’t be saved, just as with Incognito mode in Google Chrome.

Another feature that protects your location data is that you’ll be notified when an app is using your location in the background.

As demonstrated at Google I/O, a notification will ask whether you want to keep letting an app use your location when the app itself is not in use.
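
On the developer side, Android Q pairs this with a separate background-location permission, which is what these reminders hook into. A minimal Kotlin sketch of checking and requesting it (the activity name and request code are assumptions):

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import android.os.Build
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

class LocationAwareActivity : AppCompatActivity() {

    private fun ensureBackgroundLocation() {
        if (Build.VERSION.SDK_INT < 29) return  // permission exists only from Android Q (API 29)
        val granted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.ACCESS_BACKGROUND_LOCATION
        ) == PackageManager.PERMISSION_GRANTED
        if (!granted) {
            // The system shows its own permission UI (and, later, the
            // "app used your location in the background" reminders described above).
            ActivityCompat.requestPermissions(
                this,
                arrayOf(Manifest.permission.ACCESS_BACKGROUND_LOCATION),
                REQUEST_BACKGROUND_LOCATION
            )
        }
    }

    companion object {
        private const val REQUEST_BACKGROUND_LOCATION = 42  // arbitrary request code
    }
}
```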

Live Caption for the Deaf and Hard-of-Hearing community

The Live Caption feature does exactly what its name says.

It was shared at Google I/O that the device listens to whatever audio is playing and displays it live as a caption on the phone’s screen, without any noticeable delay.

This will make things easier for the deaf and hard-of-hearing community in their daily lives.

A similar feature is known as Live Relay. It is focused on people who are mute or have other speech disabilities.

Using Live Relay, they can pick up calls and type what they want to say in the moment.

Their typed text will be read out loud to the caller by the assistant.

All of the speech recognition and caption generation happens entirely on the device, without the need for an Internet connection.
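
Live Caption itself was not shown as a public API at the keynote, so as a rough illustration of on-device speech-to-text on Android, here is a sketch using the platform SpeechRecognizer with its offline-preference flag. Note that it transcribes microphone input rather than media playback, so it only approximates the idea:

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Rough illustration only: Live Caption captions media audio inside the
// system; this sketch listens to the microphone and needs the
// RECORD_AUDIO permission to be granted.
fun startOfflineRecognition(context: Context, onText: (String) -> Unit) {
    val recognizer = SpeechRecognizer.createSpeechRecognizer(context)
    recognizer.setRecognitionListener(object : RecognitionListener {
        override fun onResults(results: Bundle) {
            // Best transcription hypothesis, recognized on the device.
            val texts = results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
            texts?.firstOrNull()?.let(onText)
        }
        override fun onPartialResults(partialResults: Bundle) {}
        override fun onError(error: Int) {}
        override fun onReadyForSpeech(params: Bundle) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(rmsdB: Float) {}
        override fun onBufferReceived(buffer: ByteArray) {}
        override fun onEndOfSpeech() {}
        override fun onEvent(eventType: Int, params: Bundle) {}
    })
    val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
        // Ask for on-device recognition so no audio leaves the phone.
        putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, true)
        putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
    }
    recognizer.startListening(intent)
}
```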

Built in collaboration with the Deaf and hard-of-hearing community, and coming later this year, Live Caption automatically captions media playing on your phone. #io19 pic.twitter.com/WhzwPuJWD0

– Google (@Google) May 7, 2019

If it has audio, now it can have captions. Live Caption automatically captions media playing on your phone. Videos, podcasts and audio messages, across any app-even stuff you record yourself. #io19 pic.twitter.com/XAW3Ii4xxy

– Google (@Google) May 7, 2019

Project Euphonia

“To understand and be understood is unbelievable.”

Through Project Euphonia, Google AI is conducting non-profit research aimed at helping people with speech impairments use the Assistant better.

The Assistant is being upgraded to understand facial expressions, hums and other non-typical speech sounds.

You can submit your own audio samples to help Google improve this research. Submit here: g.co/euphonia

Partnering with nonprofits and volunteers, Project Euphonia is a @GoogleAI research effort to help people with speech impairments communicate faster and gain independence → https://t.co/JAzC1aMNZg #io19 pic.twitter.com/SBg4lru3RW

– Google (@Google) May 7, 2019



Originally published at https://garimashares.com on May 7, 2019.

