submitted 10 months ago by Welp_im_damned@lemdro.id to c/android@lemdro.id
top 28 comments
[-] sciencesebi@feddit.ro 8 points 10 months ago

"The first phone with AI built in."

LOL Google are delirious

What about autocomplete? Face detection? Virtual assistants?

[-] lolcatnip@reddthat.com 6 points 10 months ago

"AI" is a pretty meaningless term. It's impossible to say objectively whether any of the things you mentioned should be considered AI.

[-] sciencesebi@feddit.ro 1 points 10 months ago* (last edited 10 months ago)

IEEE defines it as any software whose actions automate a human behavior. All of those fall under the definition.

[-] lolcatnip@reddthat.com 3 points 10 months ago

That could mean something as simple as arithmetic.

[-] evo@sh.itjust.works 4 points 10 months ago

No, that might be accurate for what they are talking about. The absolute smallest Generative AI models (that are generally useful) are starting to shrink but are still several GB in size. Doing this on device is actually new.
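To put rough numbers on that (my own back-of-envelope figures, not anything published by Google): weight memory scales as parameter count times bytes per parameter, which is why even small but useful models land in the gigabyte range.

```python
# Back-of-envelope sketch; the parameter counts and quantization levels
# below are illustrative assumptions, not published specs.
def model_size_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate weight memory: parameter count x bytes per parameter."""
    return params_billions * 1e9 * bytes_per_param / 1e9

print(model_size_gb(3.0, 0.5))  # a ~3B-param model at 4-bit: 1.5 GB
print(model_size_gb(3.0, 2.0))  # the same model at fp16: 6.0 GB
```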

[-] sciencesebi@feddit.ro 2 points 10 months ago

It says AI, not genAI. Anyway, autocomplete is genAI, even though it may be simple GloVe embeddings and a Markov chain.
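A minimal sketch of autocomplete in that trivially "generative" sense — a toy first-order Markov chain over bigram counts, my own illustration and not any phone keyboard's actual implementation:

```python
from collections import defaultdict, Counter

def build_bigram_model(corpus: str):
    """Count word -> next-word frequencies (a first-order Markov chain)."""
    model = defaultdict(Counter)
    words = corpus.split()
    for cur, nxt in zip(words, words[1:]):
        model[cur][nxt] += 1
    return model

def autocomplete(model, word: str):
    """Suggest the word that most often followed `word` in the corpus."""
    return model[word].most_common(1)[0][0] if model[word] else None

m = build_bigram_model("the phone died and the phone died and the phone rang")
print(autocomplete(m, "phone"))  # died
```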

You don't know what the fuck you're talking about.

[-] evo@sh.itjust.works 1 points 10 months ago* (last edited 10 months ago)

Do you know how to read?

> Gemini Nano now powers on-device generative AI features for Pixel 8 Pro

Technically autocomplete can be considered Gen AI, but it obviously lacks the creativity that we all associate with Gen AI today. You don't need a model that is generally useful to do autocomplete.

The point is it didn't take a generally useful Gen AI model to do autocomplete before, but Google is now shipping features (beyond autocomplete) that use such a model. Gen AI on device is novel.

[-] sciencesebi@feddit.ro 2 points 10 months ago* (last edited 10 months ago)

I was talking about the title, not the 10th paragraph way down. Use your reading skills and tell me where the fuck "generative" is in the title.

No. Autocomplete is a feature. The model behind it can be gen AI and was for a number of years. IDGAF if it's not general purpose.

The point is you have no fucking clue what you're defending. LLMs and diffusion models have been in apps for months. You can say that general-purpose LLMs embedded into mobile OS functions are novel; the rest of it is bullshit.

[-] tja@sh.itjust.works 2 points 10 months ago

It's directly in the first paragraph...

[-] evo@sh.itjust.works 1 points 10 months ago

> where the fuck "generative" is in the title

> LLMs and diffusion models have been in apps for months.

Show me a single example of an app that has an LLM on device. Find a single one that isn't making an API call to a powerful server running the LLM. Show me the app update that adds a multi-gigabyte LLM onto the device. I'll wait...

Feel free to not respond when you realize you are wrong and you have no clue what everyone else is talking about.

[-] sciencesebi@feddit.ro 1 points 10 months ago* (last edited 10 months ago)

Are you familiar with the difference between title and paragraph? Apparently not.

Answered the same question here

Feel free to not respond when you realize you are wrong and you have no clue what I'm talking about.

[-] evo@sh.itjust.works 1 points 10 months ago

You didn't list a single production app in that post...

[-] sciencesebi@feddit.ro 1 points 10 months ago

https://llm.mlc.ai/docs/deploy/android.html

Or does it have to be on the play store or some other BS you use to backpedal?

[-] evo@sh.itjust.works 1 points 10 months ago

Reading comprehension is not your strong suit.

[-] butter@midwest.social 3 points 10 months ago

AI is broad enough that it does include those features.

But it's probably referring to machine learning.

[-] sciencesebi@feddit.ro 4 points 10 months ago

That's my point. AI includes features that were added years ago. Even ML is too broad. Autocomplete uses small ML models. Spam filters as well.

I think they mean LLMs, and specifically distilled Bards. So a subset of a subset of a subset of AI.
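For context, distillation trains a small "student" model to match a larger "teacher" model's softened output distribution. A minimal sketch of the standard Hinton-style loss — my own illustration, not Google's actual training recipe:

```python
import math

def softmax(logits, temperature=1.0):
    """Convert logits to probabilities, softened by a temperature."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy of the student against the teacher's softened outputs."""
    teacher = softmax(teacher_logits, temperature)
    student = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher, student))

# A student that matches the teacher is penalized less than one that doesn't:
print(distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(distillation_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0]))
```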

Neckbeard marketing

[-] quirzle@kbin.social 2 points 10 months ago

> What about autocomplete? Face detection? Virtual assistants?

How much of that is really built-in vs. offloaded to their cloud then cached locally (or just not usable offline, like Assistant)?

[-] sciencesebi@feddit.ro 2 points 10 months ago
[-] quirzle@kbin.social 3 points 10 months ago

Services running in GCP aren't built into the phone, which is kinda the main point of the statement you took issue with.

[-] sciencesebi@feddit.ro 1 points 10 months ago

What does that have to do with CACHING? That's client-server.

No clue what you're talking about

[-] evo@sh.itjust.works 2 points 10 months ago

That's the entire point. Running the LLM on device is what's new here...

[-] sciencesebi@feddit.ro 2 points 10 months ago

MLC LLM does the exact same thing. Lots of chat apps have low-quality LLMs embedded. Low-res image generation apps using diffusion models similar to DALL·E Mini have been around for a while.

Also, Qualcomm used its AI Stack to deploy Stable Diffusion to mobile back in February. And that's not the low-res one.

Think before you write.

[-] evo@sh.itjust.works 1 points 10 months ago

I can't find a single production app that uses MLC LLM (because of the reasons I listed earlier, like multi-gigabyte models that aren't garbage).

Qualcomm's announcement is a tech demo, and they promised to actually ship it next year...

[-] sciencesebi@feddit.ro 1 points 10 months ago

Who said anything about production and non-garbage? We're not talking about quality of responses or spread. You can use distilled RoBERTa for all I give a fuck. We're talking about whether they're the first. They're not.

Are they the first to embed an LLM in an OS? Yes. A model with over x Bn params? Maybe, probably.

But they ARE NOT the first to deploy gen AI on mobile.

[-] evo@sh.itjust.works 1 points 10 months ago

You're just moving the goalposts. I ran an LLM on device in an Android app I built a month ago. Does that make me the first to do it? No. They are the first to production with an actual product.

[-] avidamoeba@lemmy.ca 8 points 10 months ago

I'm afraid of what I'll do with Smart reply in Gboard. 🥹

[-] evo@sh.itjust.works 4 points 10 months ago

At a glance I was confused/angry about why this would only be for the Pixel 8 Pro and not the standard Pixel 8, considering they both have the same Tensor G3.

However (from my own testing), it seems very likely the full 12 GB of RAM the Pro has (vs. the 8 GB in the Pixel 8) is needed for some of these tasks, like summarization.
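A rough way to see why 8 GB might not cut it — all the numbers below are my own guesses for illustration, not Google's specs:

```python
def free_headroom_gb(total_ram_gb, os_and_apps_gb, model_gb, kv_cache_gb):
    """RAM left after the OS/resident apps, model weights, and the
    inference-time KV cache are all accounted for."""
    return total_ram_gb - (os_and_apps_gb + model_gb + kv_cache_gb)

print(free_headroom_gb(8.0, 4.0, 3.0, 1.0))   # 0.0 -- nothing to spare
print(free_headroom_gb(12.0, 4.0, 3.0, 1.0))  # 4.0 -- room on the 12 GB Pro
```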

this post was submitted on 06 Dec 2023
41 points (100.0% liked)
