llm only exist because of surveillance capitalism. no ads also means no ai.
I exist or something probably
it does matter, though
a majority of voters, but it still only ends up being about 33% of the country in literal terms.
that’s not how that works. we cant regrow (most) vital organs (liver says hi) because of “engineering problems”, not because evolution is random. we personify adaptations to understand them; it can lead to issues, but yours is a massive overcorrection.
there is not a single thing that could wipe out a deep sea habitat that wouldnt also wipe out any space colonies. but i dont see anybody arguing for that, despite it being far more achievable and practical. also, there is no feasible way for space colonies to be self sufficient anywhere in the near future, so wiping out earth also wipes out any space colonies relying on it for supplies. this argument about survivability is absurd.
the us largely does not charge for bags directly; they are a consumable that is part of the store’s customer overhead. at cost, each bag is around 3 cents and probably holds 15 to 50 dollars of merchandise that is generally being sold at around 2% net profit.
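a quick sanity check on that math (all the numbers are the rough estimates above, not sourced figures):

```python
# Rough sketch of the bag-cost math from the comment above.
# Every number here is the comment's own estimate, not sourced data.
bag_cost = 0.03                       # ~3 cents per bag at cost
merch_low, merch_high = 15.0, 50.0    # dollars of merchandise per bag
net_margin = 0.02                     # ~2% net profit on merchandise

profit_low = merch_low * net_margin   # $0.30 of profit per lightly filled bag
profit_high = merch_high * net_margin # $1.00 per well filled bag

# bag cost as a share of the profit it helps carry out the door
share_high = bag_cost / profit_low    # worst case
share_low = bag_cost / profit_high    # best case
print(f"bag is {share_low:.0%} to {share_high:.0%} of net profit per bagful")
```

so a free bag eats maybe 3 to 10 cents of every profit dollar, which is why stores just treat it as overhead.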
people often keep the bags and use them for other stuff, like trash bags or plastic linings or makeshift gloves. not everyone does. it’s wasteful, yes, though on net carbon impact it’s probably lower than plastic reusable bags and many plant fabric ones given a plastic industry exists anyway.
they are wire guided and far more stable than the similarly wire guided rockets.
is this a play on luddites?
i can give this class for free: dont
a consultant trying to make money off of teaching managers how to “manage people using ai”. this is very silly.
llm are no path to socialism. being tricked into believing that a small collection of ultrarich capitalists taking more literal ownership of middle and upper class jobs will somehow bring socialism about is unfortunate. It’s neither here nor there, llm will never get there, but still unfortunate.
infinite patience to produce bullshit has extremely limited utility
us economists are hardly at threat of being disappeared for criticizing the economy
well, you can decide for yourself when the efficiency is impressive, and now people can go look at the numbers. 20% is pretty substantial, but if you are disappointed it’s not 90% then i dont know what to tell ya
oh hey it’s that box from the chart. d&d is saved!
you’ve got it. the period of the sun up/sun down cycle would be the orbital period.
it’s not really a myth, see for yourself with the ugliest link ever: https://www.energystar.gov/productfinder/product/certified-residential-freezers/results?search_text=&sort_by=annual_energy_use_kwh_yr&sort_direction=asc&page_number=0&lastpage=0&search-1=&type_filter=Chest+Freezer&type_filter=Upright+Freezer&is_most_efficient_filter=0&capacity_total_volume_ft3_filter=7+-+13.9&capacity_total_volume_ft3_filter=14+-+21.9&markets_filter=United+States
you’ll notice that by capacity chest freezers are more efficient. There are a lot of factors stacked in their favor though:
via statistical imitation. other methods, such as solving and implementing from first principles analytically, have not been shown to be np-hard. the difference is important, but the end result is still no agigpt in the foreseeable and unforeseeable future.
the limitation is specifically that the primary machine learning technique, the same one used by all the chatbots at places claiming to pursue agi, namely statistical imitation, is np-hard.
i dont think you are grasping the absolute scale of cash injection necessary to make llm even appear vaguely tenable as a product.
i feel you are confused. institutional review boards, of which there are many, regularly allow human trials; they are necessary for the fda’s approval as well. there are tons of ways for patients to access ethically reviewed experimental treatments, and terminal patients are especially likely to. you are correct that experimenting on oneself is often far less troubling, though