LingQ's Most Important Feature is Broken

Right now, when I click on any word in the reader, LingQ does not bring up the definition. The screenshot below illustrates the problem.

This is the case for any word; the feature appears to be completely broken. For every word in my French vocabulary, from partially known to known, LingQ is literally prompting me to “type a new meaning here.”

The ability to look up a word’s definition with one quick click and move on to consuming more comprehensible input is the #1 feature of LingQ.

I believe I’ve seen others post similar reports here in the support forum.

Where is this issue on LingQ’s priority list at present? What’s the latest status?

In my line of work, when your number one feature is broken, it’s referred to as a “P1,” meaning it’s of the highest priority, a “war room” convenes in response, and the highest-level execs are present because of the stakes involved.

6 Likes

Hi gmeyer. Are you seeing this in the browser or the app, and which browser or app? I’m not seeing it in the Android app or browser (Edge) at the moment. So maybe it’s an issue on iOS or a different browser?

1 Like

Eric, I’m kinda disappointed you’re asking this question. Your log files and support tools should have this information and make it available to support staff. You need a support UI where you can search for “gmeyer” and answer that question in the context of my navigation history in the app.

Its answer would be:

[Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36]

That’s my browser’s user agent. That header is given to your server with every request made.
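As an illustration of the kind of information that header already carries (this is a generic sketch, not LingQ’s actual tooling, and the function name is hypothetical), a support tool can pull a rough browser/OS classification out of a raw User-Agent string with simple substring checks:

```python
# Minimal sketch: extracting browser and OS hints from a User-Agent
# header with naive substring checks. Real support tooling would use a
# proper UA-parsing library, since these strings are notoriously messy.

def classify_user_agent(ua: str) -> dict:
    """Return rough browser/OS guesses from a raw User-Agent string."""
    # Order matters: Chrome's UA also contains "Safari", and Edge's
    # contains "Chrome", so check the more specific tokens first.
    if "Edg/" in ua:
        browser = "Edge"
    elif "Firefox/" in ua:
        browser = "Firefox"
    elif "Chrome/" in ua:
        browser = "Chrome"
    elif "Safari/" in ua:
        browser = "Safari"
    else:
        browser = "unknown"

    if "Macintosh" in ua:
        os_name = "macOS"
    elif "Windows" in ua:
        os_name = "Windows"
    elif "Android" in ua:
        os_name = "Android"
    elif "Linux" in ua:
        os_name = "Linux"
    else:
        os_name = "unknown"

    return {"browser": browser, "os": os_name}

ua = ("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) "
      "AppleWebKit/537.36 (KHTML, like Gecko) "
      "Chrome/116.0.0.0 Safari/537.36")
print(classify_user_agent(ua))  # {'browser': 'Chrome', 'os': 'macOS'}
```

Run against the user agent above, it identifies Chrome on macOS, which is exactly the answer being asked for in this thread.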

2 Likes

My bad, now I see what you mean with the marked Known words:


LingQ Dev Team: It would be better to keep the definition visible for any word whose meaning the user has logged, regardless of status. At the very least, allow the user to click the down arrow (:arrow_down_small: ) to expand and see their saved meanings for that known word (under a marked-known-word popup).

Edit 3: I acknowledge this may have been an intentional design choice, letting users more easily type in a new meaning/context for a word they know rather than seeing old meanings/contexts. Even so, still letting users see their saved meanings in this view (under the marked-known-word popup in the image above, either via the expansion arrow or by displaying them as soon as the popup opens) would make it easier to prevent duplicate meanings.


What seems to be happening is that LingQ is deleting the saved meaning from a user’s known-word list, which is odd if the user needs to mark the word as 1–3/4 again because they forgot what it means over time.

In that case, would the user have to look up the word again just to study it or show it again, without the past contexts in which they encountered it?

Edit: The saved meaning(s) reappear when you change the status of the word to 4 or below, so it’s still there in the system:


Edit 2: If you activate the sidebar view, you can see the Saved Meaning for a Known-status word:


Note: Using LingQ via Firefox on macOS.

2 Likes

This is only happening if you do not have the sidebar activated. If you have the sidebar positioned on the right, then the definitions appear.

The definitions also appear in floating mode, but you have to expand the window and then use the arrow to show the dictionaries and meanings.

3 Likes

SeoulMate is right. Thank you for the tip @SeoulMate!

Here is a picture of that same word screenshot I posted, but with the sidebar activated:

And after hitting the down arrow (:arrow_down_small: ) next to Saved Meaning:

1 Like

gmeyer, I don’t work for LingQ. Nor have I ever.

edit: I am a developer (for another company), and it is always more useful if information about the browser, app, or what the user was doing specifically is provided. Sure, that information could most likely be dug up, but it’s always appreciated if it’s provided in the support ticket.

6 Likes

@ericb100 that’s what I was asking myself. :laughing: What? Where? When did it happen? :grin:

1 Like

First, I don’t really see any reason why either a word’s definitions, or my knowledge of that word’s definition and usage, should change with a sidebar’s activation.

A) The definitions of a word are independent of my knowledge of that word.

B) My knowledge of a word depends on that word’s conventional definitions, but is independent of my usage of any language-learning app and its features.

C) The features of the app should reflect this. Simply put, I see only unnecessary confusion introduced into the language-learning experience when, in some usage contexts, a word appears as known, with its definition, and in others it appears as needing a definition ascribed to it.

Second, it simply doesn’t work in the sidebar, either.

In addition to fixing the bug, I’m of the strong opinion that vocabulary management in LingQ needs to be substantially rethought in the era of generative AI.

Rather than manual integration with traditional dictionaries and self-management of vocabularies, what I’d really like to see when I simply click on a word is the kind of response this ChatGPT prompt demonstrates.

As brief as possible, even possibly expressed as a single word or maybe two or possibly three, what is the English definition of the French word “télescope” as used in the following sentence:

Le dioxyde de carbone détecté sur une des lunes de Jupiter, Europe, provient d’un océan situé sous son épaisse couche de glace, selon des données du télescope spatial James Webb qui confortent les espoirs que cette eau cachée puisse abriter la vie.

ChatGPT responds with:

telescope

Here’s another example…

As brief as possible, even possibly expressed as a single word or maybe two or possibly three, what is the English definition of the French words “Le dioxyde de carbone” as used in the following sentence:

Le dioxyde de carbone détecté sur une des lunes de Jupiter, Europe, provient d’un océan situé sous son épaisse couche de glace, selon des données du télescope spatial James Webb qui confortent les espoirs que cette eau cachée puisse abriter la vie.

ChatGPT responds with:

carbon dioxide
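Wiring up the two lookups above programmatically would mostly be a matter of templating the same prompt. As a sketch (the function name is mine, and the actual call to an LLM API is deliberately left out, since which model or endpoint LingQ might use is pure speculation):

```python
# Minimal sketch of how a one-click contextual lookup could build its
# prompt. The template mirrors the examples above; sending it to an
# LLM API is omitted, as those details are assumptions.

def build_definition_prompt(word: str, sentence: str,
                            source_lang: str = "French",
                            target_lang: str = "English") -> str:
    """Assemble a brevity-constrained, context-aware definition prompt."""
    return (
        f"As brief as possible, even possibly expressed as a single word "
        f"or maybe two or possibly three, what is the {target_lang} "
        f"definition of the {source_lang} word \"{word}\" as used in the "
        f"following sentence:\n\n{sentence}"
    )

sentence = ("Le dioxyde de carbone détecté sur une des lunes de Jupiter, "
            "Europe, provient d'un océan situé sous son épaisse couche "
            "de glace.")
print(build_definition_prompt("télescope", sentence))
```

The word the reader clicked and the sentence it sits in are the only two inputs the app already has on hand, which is what makes this a one-click flow.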

Instead of asking me for my own personal definition of a word (with the “type a new meaning here” prompt) that I seem to have momentarily forgotten, I’d rather see a good, common, contextual definition like this.

IMO, more than almost anything else, LingQ should show me the definition of a word when I click on it.

1 Like

@gmeyer I tried on Chrome just now, with French as well, and it seems to work. It might be browser-dependent.

1 Like

Ericb100, then my apologies. Yet in that context, please do recognize my respect for your knowledge and helpfulness.

(I guess five years ago and earlier, I thought it belonged in the support ticket. In 2023 and forward, I more so think it needs to be in the support team’s tool kit.)

3 Likes

No apology needed. Maybe you were confusing me with the Eric who does work for LingQ, although I haven’t seen him around on YouTube or the message boards lately.

Also, even though the LingQ team may be able to look up what browser, app, etc. you’re working in, providing it here not only saves them time (digging through millions of calls or logs), but also lets us fellow users see if we can confirm the issue, or maybe discover a workaround (until it’s fixed), etc.

3 Likes

Sorry @gmeyer, you constantly mention “AI,” but do you know how it works technically? We discussed that months ago, and I’ve been following it since the beginning of the year.
Please read this explanation about the grammar: https://forum.lingq.com/t/using-chatgpt-as-a-language-tutor/68306

1 Like

Based on my experience, it works and yet it doesn’t. It depends on what you ask of it and how you ask it. HUGE topic.

In this thread we’re talking about vocabulary and mostly that of individual words.

Here, I’ve found it cleaner than what LingQ has crowdsourced.

1 Like

Yes, but that’s a problem too, unfortunately. LingQ cannot use a dictionary as its vocabulary source for copyright reasons, and we definitely can’t trust ChatGPT, because we don’t know where it takes its information from, or whether that information is misleading.
Plus, ChatGPT doesn’t produce correct translations, or correct definitions, across all the language pairs LingQ supports, because the quantity of training data is not the same for all languages. So many users wouldn’t have the experience you think they would, and you would need to integrate several different “AIs” trained on different languages.

I personally write a lot of definitions, I mean tens of thousands, and I use more than one dictionary when I can, because even they are not always correct. Depending on the language, I prefer different dictionaries and tools.

With LingQ you either research your own definition (which is a waste of time, I know) or “trust” some other user’s definition (which in most cases I don’t!). I wouldn’t trust an AI that randomly copies sentences or definitions from everywhere (without even paying anyone, btw); I would prefer, as we discussed in another thread, the possibility to buy a real dictionary and integrate it into my vocabulary.

Plus, ChatGPT is not free; it costs money, and LingQ is already expensive. They have added Whisper and TTS, which is fantastic, but I don’t like the idea of “AI” being integrated into something where it can be misleading or wrong. Otherwise, many users might fall into the trap of trusting a lot of wrong information without even realising it, and this is happening more and more.

However, I agree that continual problems like the one you mentioned at the top shouldn’t happen, and a better UX would definitely be awesome.

Just as Whisper and TTS were integrated, AI could be integrated as well, hopefully once prices are cheaper so we don’t have to pay more for it.
Imho.

1 Like

I think this is where you’re the exception.

I also think a sentence such as:

Ya le he dicho todo lo que quiero decirle.

is beyond a Turing test. I wonder what percentage of native speakers could correctly break down the grammatical structure of such a phrasing.

Additionally, I suspect that many such phrases with a high “grammatical complexity to number of syllables” ratio are rather idiomatic and nuanced as integrated phrases, and the tools of grammar start to lose their utility.

Relatedly, something I’ve only begun to explore with ChatGPT is asking it to break things down by morpheme, guiding it through step 1, step 2, step 3, etc. of how I want it to do the analysis.
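That step-by-step guiding could look something like the sequence below. This is purely illustrative: the wording of the prompts and the helper function are my own invention, not anyone’s actual tooling, and sending each prompt in turn to a chat model is left out.

```python
# Illustrative only: a staged prompt sequence for guided morpheme
# analysis. The exact wording, and the chat API that would carry the
# conversation, are assumptions.

def morpheme_analysis_prompts(word: str, language: str = "French") -> list:
    """Return an ordered list of prompts that walk an LLM through
    a morpheme-level breakdown one step at a time."""
    return [
        f"Step 1: Split the {language} word \"{word}\" into its "
        f"morphemes, listing each one on its own line.",
        "Step 2: For each morpheme above, state whether it is a root, "
        "prefix, suffix, or inflectional ending.",
        "Step 3: Explain how the morphemes combine to produce the "
        "word's meaning, in two sentences or fewer.",
    ]

for prompt in morpheme_analysis_prompts("incontestablement"):
    print(prompt)
```

Keeping all three steps in one conversation matters, since step 2 and step 3 refer back to the model’s own answer from step 1.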

Anyhow, “all” is a really tough word. In English it seems it can function as an adjective, adverb, noun, and pronoun.

I just looked up “tout” in my French vocabulary on LingQ. (It seems its definition had been deleted on the server!) Looking at LingQ’s mappings to parts of speech and definitions, they’re worse than what ChatGPT produces for me.

3 Likes

Often, the individual dictionaries are “wrong” for a given context, or a word doesn’t even appear, or they show one or two meanings of the word which don’t fit the given context. That’s why I often have to look at a few of the different dictionaries I have, and sometimes I still don’t find the word at all. The past few weeks I’ve been using ChatGPT more (version 4), and it seems mostly right; it has even been helpful for some things I couldn’t find in the dictionary, possibly because they are more colloquial or regional words (German). By the way, version 4 correctly describes “todo” as a pronoun in the context of the sentence from the earlier ChatGPT thread. I think version 4 has improved quite a bit over 3.5 (my experience so far and general understanding).

So I think it would be helpful to use, especially in the beginning stages. I think it would be great to have, in sentence mode, in addition to the “translate sentence” button, an “explain this sentence” button that gives the grammar/meaning breakdown. The nice thing is that it will break out the typical collocations. Often, it’s a little unclear what role a given word is playing in a sentence.

So, it might get some things wrong, and ymmv depending on the language as well. You’re right, though: cost would be a factor, but maybe it would be worth it. For now, I’ll copy and paste to ChatGPT when I need a little help, taking it with a grain of salt that it might give an occasional wrong answer. Just like the dictionaries, but overall I’ve found it just as good and often better, especially in colloquial situations.

2 Likes

Yes, there is a huge difference in the number of billions of words and the amount of data the two models are trained on.

One thing is to individually pay 20€/month for an extra service (version 4); another is a 200€ upgrade on LingQ (no, thank you). And that’s without considering that there are still a lot of discussions about copyright infringement by those models, governments getting involved, different AIs banned in different parts of the world, and so on.
If I were LingQ I would be more conservative before integrating it into the overall system. Things are improving, but there would be a greater benefit in reducing some nasty bugs and improving the overall UX than in adding ChatGPT (which a user can use outside LingQ anyway).

Imho.

PS: Don’t get me wrong, I use it too, but only on certain occasions. In the end, most of the time we need reading + listening, so I would invest that money in fixing our user experience once and for all.

2 Likes

I definitely don’t disagree with your points, davideroccato. First and foremost, my main concern is that LingQ is the best input-based program out there, and keeping those features working should always be the main focus.

2 Likes

Same here, I thought it was just me. I get the same on the web version in Chrome on ChromeOS.

1 Like