The main points:

  • Apparently, the direct cause was Japan raising its interest rates. Investors globally had been borrowing yen (which carried low interest rates) and investing it elsewhere, turning a quick profit on the difference between the yen’s interest rate and the return on the investment. When the yen’s interest rate went up, a bunch of investors started unwinding those trades and selling off assets to repay their yen loans, which carried over to the US market (see the rough sketch after this list).

  • The issue was exacerbated by recent reports of US economic shrinkage.

  • Most stocks and all major indexes have dropped significantly.

  • Tech companies were particularly affected, especially Nvidia, Apple, and Tesla, as well as US-based and Taiwan-based chip manufacturers.

  • Cryptos are crashing as investors liquidate assets.

  • Japanese and Korean stock markets were also severely affected directly, with similar downward spirals.

  • European markets started getting affected as well.
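
To make the carry-trade mechanism in the first bullet concrete, here’s a rough back-of-the-envelope sketch in Python. The rates and exchange rates are made-up illustrative numbers, not the actual figures involved; the point is just how a small rate hike plus a strengthening yen can flip the trade from profit to loss and force an unwind.

    # Rough sketch of a yen carry trade, with made-up illustrative numbers.
    # Borrow yen cheaply, convert to dollars, invest at a higher rate,
    # then convert back and repay the yen loan a year later.

    borrowed_yen = 100_000_000   # loan taken out in yen
    jpy_rate     = 0.001         # ~0.1% interest on the yen loan (hypothetical)
    usd_return   = 0.05          # 5% return on the US investment (hypothetical)

    usd_jpy_at_start = 150.0     # yen per dollar when the trade is opened
    usd_jpy_at_end   = 150.0     # yen per dollar when the trade is closed

    # Convert the borrowed yen to dollars and invest it
    usd_invested = borrowed_yen / usd_jpy_at_start
    usd_after    = usd_invested * (1 + usd_return)

    # Convert back to yen and repay the loan with interest
    yen_back = usd_after * usd_jpy_at_end
    yen_owed = borrowed_yen * (1 + jpy_rate)
    print(f"profit: {yen_back - yen_owed:,.0f} yen")   # positive while the spread holds

    # If Japan hikes rates and the yen strengthens (fewer yen per dollar),
    # the same trade turns into a loss, which is what forces the unwinding:
    usd_jpy_at_end = 142.0       # hypothetical yen appreciation
    jpy_rate       = 0.0025      # hypothetical higher borrowing cost
    yen_back = usd_after * usd_jpy_at_end
    yen_owed = borrowed_yen * (1 + jpy_rate)
    print(f"profit: {yen_back - yen_owed:,.0f} yen")   # now negative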

I have no idea whether the yen is actually the culprit here. I’d say it’s probably a contributing factor or a catalyst. But we’ve been seeing US tech companies faltering for some time now, after the US initiated a trade war with China. Chinese tech companies have been making huge strides in the past year alone, while Western tech companies have remained stagnant or regressed.

My interpretation, at least as far as the fire sale of US tech stocks goes, is that these companies have been losing ground to China for some time, and they’ve been underperforming in the stock market for a while (recall that Nvidia’s stock has been dropping for the past two weeks or so). Whatever arbitrary event caused Wall Street investors to start dumping their stocks, the recent poor performance of American tech companies made them a prime target for unloading first. In essence, the US tried to start a tech and trade war with China and ended up shooting itself in the foot. Meanwhile, China has yet again proven to be taking the right actions.

Worth noting that there’s talk the US Federal Reserve could have taken action to keep this crash from being so severe, but it didn’t. The Fed says there’s still time to act and nothing to worry about, without elaborating further.

This is entirely my own speculation, but it’s quite possible that a crash was expected and was allowed to unfold so it would blow up in Trump’s face once elected. It just happened a few months earlier than expected.

Edit: I’m not an economist, and I’m not involved in finance, so if anyone would like to correct me on anything, feel free. Also, apologies for using CNBC, but it was the only place I found that listed the events neatly, without dressing them up (and with minimal intrusion from cookie notifications).

  • knightly [none/use any]@hexbear.net · 3 months ago

    Large language models suck. The tech is stagnant because there’s no new training data or tweak to the model that could possibly resolve the structural issues.

    It’s going down just like crypto. Not to disappear forever, but to fade into the background where the only remaining users are scam artists and their marks.

    • loathsome dongeater@lemmygrad.ml · 3 months ago

      I don’t understand the argument that there is little new data. There is already so much data to train them on. My guess is that if the technology were hypothetically much more advanced than it is right now, and LLMs were what their peddlers market them as, then the available data would cover many more use cases than are covered right now.

      • USSR Enjoyer@lemmygrad.ml · 3 months ago

        Human beings learn more from just a tiny fraction of the input that LLMs require. It’s pretty convenient that tech bros always want to pump bigger and bigger datasets as a solution to the shittiness of LLMs, rather than admit humans are vastly more important, skilled, original, creative and interesting than a fucking gigawatt-sucking datacenter.

      • knightly [none/use any]@hexbear.net · 3 months ago

        You misunderstand, I’m not saying that there is no new data to train them with, I’m saying that they can add as much data as they want but it won’t solve the problems.

      • huf [he/him]@hexbear.net · 3 months ago

        they trained these things on shit found on the internet, right? but the internet is now AI-poisoned; you can’t use it to train another generation again. well, you can, but it’ll be even worse than the current ones.

        sure, there’s still lots of unpoisoned data out there, but it’ll be a LOT more expensive and a LOT more work to gather it now.