Pretty sure this is a bug in either Discover or Flatpak. My guess is that Flatpak has the two versions it feeds Discover swapped, so they appear reversed, but in reality the update will be fine.
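If you want to double-check which version is actually the newer one, you can compare what the flatpak CLI reports for the installed app against what the remote offers. A rough sketch of that check (assuming the `flatpak` CLI is on PATH and the app publishes a Version field; the app ID and remote name below are placeholders, not anything from this thread):

```python
# Minimal sketch: compare the installed Flatpak version with the remote one.
import subprocess

APP_ID = "org.example.SomeApp"   # placeholder app ID
REMOTE = "flathub"               # placeholder remote name


def version_field(cmd: list[str]) -> str | None:
    """Run a flatpak command and pull the 'Version:' line from its output."""
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    for line in out.splitlines():
        key, _, value = line.partition(":")
        if key.strip() == "Version":
            return value.strip()
    return None


installed = version_field(["flatpak", "info", APP_ID])
available = version_field(["flatpak", "remote-info", REMOTE, APP_ID])
print(f"installed: {installed}")
print(f"available: {available}")
```

If the version the remote offers is the higher of the two, whatever Discover displays is just cosmetic, which matches the swapped-fields guess above.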
Okay! Thanks for the reply. :)
Sounds like https://bugs.kde.org/show_bug.cgi?id=465864
Aaaand solved! Thanks for this precise reply! Pretty sure it's exactly that.
edit: turns out I'm wrong. Google does index Lemmy pages; it's just not easy to find them through Google.
~~I don't think it's possible to find any Lemmy posts through Google~~
Nah, we already are on Google, but it depends on a lot of factors. A Lemmy frontend is just another webpage, so Google will crawl it if you allow it. If an instance disallows it specifically, or has an incorrect or unfamiliar sitemap, it might not show up.
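The "disallows it specifically" part is easy to check for any given instance: the first thing a crawler consults is robots.txt, and Python's standard library can read it directly. A small sketch (the instance URL and paths below are placeholders; whether the sitemap is usable is a separate question this doesn't touch):

```python
# Check whether an instance's robots.txt lets Googlebot crawl a few paths.
from urllib.robotparser import RobotFileParser

INSTANCE = "https://lemmy.example.org"   # placeholder instance

robots = RobotFileParser()
robots.set_url(f"{INSTANCE}/robots.txt")
robots.read()

# Googlebot is the crawler the thread is talking about; the paths are examples.
for path in ("/post/1", "/c/linux", "/u/someone"):
    allowed = robots.can_fetch("Googlebot", f"{INSTANCE}{path}")
    print(f"{path}: {'crawlable' if allowed else 'disallowed'}")
```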
Hmm, maybe that's the case for lemmy.world. I wasn't able to find anything from my comment or post history, even though I copied it into the search exactly.
edit: just had the idea to put quotation marks around the thing I copied, to search for that exact string, and now I found it. Yeah, it really does seem like Lemmy pages just aren't very popular results, so Google pushes them all the way to the bottom.
Exactly. The reason is that Google favors pages that game the algorithm over actual content. That, and popularity.
The major difference between Lemmy and Reddit is that there are many instances for search engines to crawl, compared to a single reddit.com. They likely treat each instance separately, which leads to a lot of duplicate content, and most of Lemmy isn't search engine optimized.
Sadly, I don't see a better way to handle it than for search engines to be optimized for this kind of federated platform. It's not obvious from the outside which instance is the preferred one to show a user.
I've had some luck finding content on Lemmy by forcing a specific instance using site:lemmy.instance.domain, but it depends on the search engine whether it's respected.
Yeah, good point about multiple domains.
Honestly a shame. R*ddit is full of helpful information, as is Lemmy, but the latter is not indexed.
I just tried searching "element lemmy" and got the article *Lemmy: Fans call for periodic table element to be named after Motörhead frontman*.
Whereas "element reddit" gives /r/elementchat/
Lemmy is indexed on Google, as using the site: operator will show; e.g. "rust site:programming.dev" gives sensible results. But there's no way to search across all of Lemmy. Well, not with Google anyway (Kagi has a Fediverse lens that works fairly well).
I couldn't find it in my comment history, but I saw a thread months ago where someone was lamenting that after migrating from Reddit they could no longer just google "episode ### discussion" for the show they were watching and find a corresponding thread; the same trick wasn't working for them with Lemmy. Someone else pointed out that it might be because Google personalises some of the search results now, so I tried their example query, and the top link was the post I was commenting on. Google had already indexed the most relevant result about an hour after the original post.
It is called "downgrading", and it is not uncommon to have some packages downgraded when updating/upgrading a system, for several reasons.
Under what circumstances? I don't think I've ever seen a package downgraded during an upgrade.
I somehow missed that this was a Flatpak via Discover. Granted, it may not be usual in distros with a traditional update model, but downgrades can happen in rolling distros, in distros with overlapping minor versions, or when third-party repos provide packages that conflict with the distro's own.
I offer my system as an example:
The following product is going to be upgraded:
openSUSE Tumbleweed 20240211-0 -> 20240313-0
The following 14 packages are going to be downgraded:
ghc-binary ghc-containers ghc-deepseq ghc-directory ghc-exceptions ghc-mtl ghc-parsec ghc-pretty ghc-process ghc-stm ghc-template-haskell ghc-text ghc-time ghc-transformers
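For what it's worth, the decision itself is just a version comparison: if the candidate version sorts below the installed one, the resolver reports a downgrade. A toy illustration of that idea (real package managers such as rpm and dpkg use much more elaborate comparison rules, with epochs, tildes and so on, and the version numbers below are invented):

```python
# Naive split-on-dots version comparison, purely for illustration.
def vertuple(version: str) -> tuple:
    # Numeric parts compare numerically, anything else as text.
    return tuple((0, int(p)) if p.isdigit() else (1, p) for p in version.split("."))


def classify(installed: str, candidate: str) -> str:
    if vertuple(candidate) > vertuple(installed):
        return "upgrade"
    if vertuple(candidate) < vertuple(installed):
        return "downgrade"
    return "unchanged"


# Invented numbers, in the spirit of the zypper output above.
for name, installed, candidate in [
    ("ghc-binary", "0.8.9.1", "0.8.9.0"),
    ("ghc-text", "2.0.2", "2.1"),
]:
    print(f"{name}: {classify(installed, candidate)}")
```

Unlike the Discover case earlier in the thread, the zypper downgrades here are real ones, not a display issue.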
No. This is just a thing Discover does. Unless nearly every update I've done for every Flatpak installed on my Steam Deck has actually been a downgrade.
No, it's just a (long since fixed!) bug. In the case of the Deck, the fix will arrive with the next version of SteamOS soon... in the case of Debian, they don't ship our bugfix releases, so it'll be stuck with this until Debian 13 :/
As someone else pointed out, it's a bug, and it's mentioned on the KDE bug tracker.