[-] NeatNit@discuss.tchncs.de 62 points 4 days ago

To add to that, credit where credit is due, LLMs can often pick up on things like this. Machine translation has been LLM-based (or based on primitive ancestors of LLMs) for many years, even before the AI boom. So AI probably helped a bit here.

That's my wild guess. I wouldn't call it a hypothesis, I'm just talking out of my ass.

[-] Elting@piefed.social 51 points 4 days ago

Translation might be the only thing they genuinely do better than older tools.

[-] lugal@sopuli.xyz 26 points 4 days ago* (last edited 4 days ago)

There are other uses in computational linguistics. My master's thesis was a neural parser. Other uses include pattern recognition in medicine, for example. But your point stands that it often makes things worse.

[-] Elting@piefed.social 10 points 4 days ago

I had heard about the medicine thing, actually. When the use case actually lines up with what the tool is, it makes sense. It's that old adage though: "When you have a hammer, everything looks like a nail."

[-] BaroqueInMind@piefed.social 4 points 3 days ago

Is there any way I can read your thesis? I'm casually curious, and also have no idea if college theses are allowed to be shared online with rando people like me.

[-] lugal@sopuli.xyz 1 points 3 days ago

It depends in part on your ability to read German 😅 I wrote another comment elaborating a little and giving clues for "further reading".

[-] Cris_Citrus@piefed.zip 2 points 3 days ago* (last edited 3 days ago)

That's super cool! What sort of things did your neural parser do?

[-] lugal@sopuli.xyz 4 points 3 days ago

Well, it parses natural language. In linguistics, or syntax to be precise, there are different ideas on how to build syntax trees. The most common is Dependency Grammar: basically just a tree where every word points to the word it refers to (the adjective to the noun, the subject and the object to the verb; the verb is the root). I applied this to a different syntax theory called Role and Reference Grammar. You can google the latter. If you want to look into neural parsers in general, stanfordNLP has modules for Python, and I think online tools as well.
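The head-pointer idea described above can be sketched with a toy example. This is a hand-built illustration, not the output of any real parser; the sentence, the head indices, and the `children` helper are all made up to show the structure (real toolkits like stanfordNLP produce a similar head-index representation per token):

```python
# Toy Dependency Grammar parse of "the cat chased a mouse".
# Each token stores the index of the word it points to; the root points to -1.

SENTENCE = ["the", "cat", "chased", "a", "mouse"]

# HEADS[i] = index of the word that token i depends on (-1 = root):
# "the" -> "cat", "cat" -> "chased" (subject), "chased" = root,
# "a" -> "mouse", "mouse" -> "chased" (object)
HEADS = [1, 2, -1, 4, 2]

def children(head_index):
    """Return the tokens that point directly at the given head."""
    return [SENTENCE[i] for i, h in enumerate(HEADS) if h == head_index]

root = HEADS.index(-1)
print("root:", SENTENCE[root])             # the verb is the root
print("depends on root:", children(root))  # subject and object point to the verb
```

Running this prints `chased` as the root and `['cat', 'mouse']` as its direct dependents, matching the description above: everything ultimately hangs off the verb.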

[-] grissino@lemmy.world 2 points 3 days ago

A hypothesis is basically a guess based on logical assumptions, so you're there already.

this post was submitted on 29 Apr 2026
662 points (100.0% liked)

Science Memes