Google, Amazon buy nonexistent mini nuclear reactors for AI data centers
  • blakestacey blakestacey 3d ago 100%

    Your blithe trust in capital fails to be endearing.

    10
  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 20 October 2024
  • blakestacey blakestacey 3d ago 100%

    alignment offer

    That's sure a choice of words, ain't it

    13
  • Google, Amazon buy nonexistent mini nuclear reactors for AI data centers
  • blakestacey blakestacey 4d ago 100%

    as you will discover if you read,

    Hey, man, that's a lot to ask, you know, brah

    11
  • Google, Amazon buy nonexistent mini nuclear reactors for AI data centers
  • blakestacey blakestacey 4d ago 100%

    Google has signed a deal with California startup Kairos Power for six or seven small modular reactors. The first is due in 2030

    So, well after the bubble will have popped.

    17
  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 20 October 2024
  • blakestacey blakestacey 5d ago 100%

    Good sneer from "Internet_Janitor" a few comments up the page:

    LLMs inherently shit where they eat.

    18
  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 20 October 2024
  • blakestacey blakestacey 6d ago 100%

    The only argument I find here against it is the question of whether someone’s personal opinions should be a reason to be removed from a leadership position.

    What do these people think leadership is?

    14
  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 20 October 2024
  • blakestacey blakestacey 6d ago 100%

    (wipes away a single tear)

    Beautiful, man, just beautiful

    7
  • LLMs can’t reason — they just crib reasoning-like steps from their training data
  • blakestacey blakestacey 6d ago 93%

    (Preface: I work in AI)

    Preface: repent for your sins in sackcloth and ashes.

    IMO, LLM’s are what they are, a good way to spit information out fast.

    Buh bye now.

    13
  • LLMs can’t reason — they just crib reasoning-like steps from their training data
  • blakestacey blakestacey 6d ago 100%

    What if I told you I have the power to ban you from the forum because you're terminally boring?

    15
  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 20 October 2024
  • blakestacey blakestacey 6d ago 100%

    According to the Wikipedia article on Bell's theorem, the physicist Tsung-Dao Lee "came close to" inventing it independently. Word from people who would know is that he actually got it, deriving a no-go theorem for local hidden variables that was basically the same as the one Feynman gave much later. But the only documentation for this is in e-mails that have never been published, so nobody can fix the Wikipedia page.

    9
  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 20 October 2024
  • blakestacey blakestacey 6d ago 100%

    yeah, 3b1b animations can take you through all of undergrad math in probably a month if it all existed and you used anki

    We could bottle this arrogance and sell it as an emetic.

    And besides, we all know that mathematics videos peaked with the Angle Dance.

    8
  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 20 October 2024
  • blakestacey blakestacey 6d ago 100%

    Fun fact: The plain vanilla physics major at MIT requires three semesters of quantum mechanics. And that's not counting the quantum topics covered in the statistical physics course, or the experiments in the lab course that also depend upon it.

    Grad school is another year or so of quantum on top of that, of course.

    (MIT OpenCourseWare actually has fairly extensive coverage of all three semesters: 8.04, 8.05 and 8.06. Zwiebach was among the best lecturers in the department back in my day, too.)

    10
  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 20 October 2024
  • blakestacey blakestacey 7d ago 100%

    This is just straight-up gossip, but why not:

    Tegmark used to go around polling physicists at conferences about which interpretation of quantum mechanics they prefer. A colleague of mine said they were sitting near Tegmark and saw him fudging the numbers in his notes — erasing the non-Many-Worlds tallies from respondents who said they supported Many Worlds along with other interpretations, IIRC.

    9
  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 20 October 2024
  • blakestacey blakestacey 1w ago 100%

    Max Tegmark has taken a break from funding neo-Nazi media to blather about Artificial General Intelligence.

    As humanity gets closer to Artificial General Intelligence (AGI)

    The first clause of the opening line, and we've already hit a "citation needed".

    He goes from there to taking a prediction market seriously. And that Aschenbrenner guy who thinks that Minecraft speedruns are evidence that AI will revolutionize "science, technology, and the economy".

    You know, ten or fifteen years ago, I would have disagreed with Tegmark about all sorts of things, but I would have granted him default respect for being a scientist.

    16
  • LLMs can’t reason — they just crib reasoning-like steps from their training data
  • blakestacey blakestacey 1w ago 100%

    Rooting around for that Luke Skywalker "every single word in that sentence was wrong" GIF....

    17
  • How ChatGPT nearly destroyed my wedding day
  • blakestacey blakestacey 1w ago 100%

    "Comment whose upvotes all come from programming dot justworks dot dev dot infosec dot works" sure has become a genre of comment.

    25
  • Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no. If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    [Last week's thread](https://awful.systems/post/2447490)

    (Semi-obligatory thanks to @dgerard for [starting this](https://awful.systems/post/1162442))

    40
    307

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no. If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    [Last week's thread](https://awful.systems/post/2391466)

    (Semi-obligatory thanks to @dgerard for [starting this](https://awful.systems/post/1162442))

    32
    165

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no. If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    [Last week's thread](https://awful.systems/post/2334840)

    (Semi-obligatory thanks to @dgerard for [starting this](https://awful.systems/post/1162442))

    27
    200

    Time for some warm-and-fuzzies! What happy memories do you have from your early days of getting into computers/programming, whenever those early days happened to be?

    When I was in middle school, I read an article in *Discover* Magazine about "artificial life" — computer simulations of biological systems. This sent me off on the path of trying to make a simulation of bugs that ran around and ate each other. My tool of choice was PowerBASIC, which was like QBasic except that it could compile to .EXE files. I decided there would be animals that could move, and plants that could also move. To implement a rule like "when the animal is near the plant, it will chase the plant," I needed to compute distances between points given their *x*- and *y*-coordinates. I knew the Pythagorean theorem, and I realized that the line between the plant and the animal is the hypotenuse of a right triangle. Tada: I had invented the distance formula!

    19
    24
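
    The post above re-derives the distance formula to implement the "chase the plant" rule. Here is a minimal sketch of that check in Python rather than PowerBASIC; the function names, the nearness threshold, and the step size are made up for illustration, not taken from the original program:

    ```python
    import math

    def distance(x1, y1, x2, y2):
        # Pythagorean theorem: the segment between two points is the hypotenuse
        # of a right triangle with legs (x2 - x1) and (y2 - y1).
        return math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)

    def chase_step(animal, plant, near=10.0, speed=1.0):
        """If the animal is near the plant, take one step toward it."""
        (ax, ay), (px, py) = animal, plant
        d = distance(ax, ay, px, py)
        if d == 0 or d > near:
            return animal  # already there, or too far away to notice
        # Move `speed` units along the line from animal to plant.
        return (ax + speed * (px - ax) / d, ay + speed * (py - ay) / d)

    print(distance(0, 0, 3, 4))        # 5.0
    print(chase_step((0, 0), (3, 4)))  # (0.6, 0.8)
    ```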

    So, here I am, listening to the *Cosmos* soundtrack and strangely not stoned. And I realize that it's been a while since we've had [a random music recommendation thread](https://awful.systems/comment/3413334). What's the musical haps in your worlds, friends?

    22
    39

    Need to make a primal scream without gathering footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh facts of Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no. If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    31
    208
    https://www.youtube.com/watch?v=TtVJ4JDM7eM

    Bumping this up from the comments.

    34
    2
    "Initials" by "Florian Körner", licensed under "CC0 1.0". / Remix of the original. - Created with dicebear.comInitialsFlorian Körnerhttps://github.com/dicebear/dicebearME
    bless this jank blakestacey 4mo ago 100%
    503?

    Was anyone else getting a 503 error for a little while today?

    1
    0

    Need to make a primal scream without gathering footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

    Any awful.systems sub may be subsneered in this subthread, techtakes or no. If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    31
    134

    Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

    Any awful.systems sub may be subsneered in this subthread, techtakes or no. If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    22
    103
    https://www.tumblr.com/neil-gaiman/750412770921611264/i-apologize-if-youve-been-asked-this-question

    > Many magazines have closed their submission portals because people thought they could send in AI-written stories.
    >
    > For years I would tell people who wanted to be writers that the only way to be a writer was to write your own stories because elves would not come in the night and do it for you.
    >
    > With AI, drunk plagiaristic elves who cannot actually write and would not know an idea or a sentence if it bit their little elvish arses will actually turn up and write something unpublishable for you. This is not a good thing.

    136
    13
    arstechnica.com

    > Tesla's troubled Cybertruck appears to have hit yet another speed bump. Over the weekend, dozens of waiting customers reported that their impending deliveries had been canceled due to "an unexpected delay regarding the preparation of your vehicle."
    >
    > Tesla has not announced an official stop sale or recall, and as of now, the reason for the suspended deliveries is unknown. But it's possible the electric pickup truck has a problem with its accelerator. [...] Yesterday, a Cybertruck owner on TikTok posted a video showing how the metal cover of his accelerator pedal allegedly worked itself partially loose and became jammed underneath part of the dash. The driver was able to stop the car with the brakes and put it in park. At the beginning of the month, another Cybertruck owner claimed to have crashed into a light pole due to an unintended acceleration problem. Meanwhile, [layoffs](https://arstechnica.com/cars/2024/04/tesla-to-lay-off-more-than-10-percent-of-its-workers-as-sales-slow/)!

    55
    1
    https://www.404media.co/google-books-is-indexing-ai-generated-garbage/

    > Google Books is indexing low quality, AI-generated books that will turn up in search results, and could possibly impact Google Ngram viewer, an important tool used by researchers to track language use throughout history.

    43
    2
    https://futurism.com/the-byte/elon-musk-boring-company-tunnel-sludge

    [[Eupalinos of Megara](https://calteches.library.caltech.edu/4106/1/Samos.pdf) appears out of a time portal from ancient Ionia] Wow, you guys must be really good at digging tunnels by now, right?

    259
    40
    https://themarkup.org/news/2024/03/29/nycs-ai-chatbot-tells-businesses-to-break-the-law

    > In October, New York City announced a plan to harness the power of artificial intelligence to improve the business of government. The announcement included a surprising centerpiece: an AI-powered chatbot that would provide New Yorkers with information on starting and operating a business in the city.
    >
    > The problem, however, is that the city’s chatbot is telling businesses to break the law.

    125
    9

    a lesswrong: [47-minute read](https://www.lesswrong.com/posts/pzmRDnoi4mNtqu6Ji/the-cognitive-theoretic-model-of-the-universe-a-partial) extolling the ambition and insights of Christopher Langan's "CTMU"

    a science blogger back in the day: [not so impressed](http://www.goodmath.org/blog/2011/02/11/another-crank-comes-to-visit-the-cognitive-theoretic-model-of-the-universe/)

    > [I]t’s sort of like saying “I’m going to fix the sink in my bathroom by replacing the leaky washer with the color blue”, or “I’m going to fly to the moon by correctly spelling my left leg.”

    Langan, incidentally, is [a 9/11 truther, a believer in the "white genocide" conspiracy theory and much more besides](https://rationalwiki.org/wiki/Christopher_Langan).

    17
    17

    Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

    Any awful.systems sub may be subsneered in this subthread, techtakes or no. If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post, there’s no quota here and the bar really isn't that high

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    24
    64

    If you've been around, you may know Elsevier for [surveillance publishing](https://golem.ph.utexas.edu/category/2021/12/surveillance_publishing.html). Old hands will recall their [running arms fairs](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1809159/). To this storied history we can add "automated bullshit pipeline".

    In *[Surfaces and Interfaces](https://doi.org/10.1016/j.surfin.2024.104081),* online 17 February 2024:

    > Certainly, here is a possible introduction for your topic:Lithium-metal batteries are promising candidates for high-energy-density rechargeable batteries due to their low electrode potentials and high theoretical capacities [1], [2].

    In *[Radiology Case Reports](https://doi.org/10.1016/j.radcr.2024.02.037),* online 8 March 2024:

    > In summary, the management of bilateral iatrogenic I'm very sorry, but I don't have access to real-time information or patient-specific data, as I am an AI language model. I can provide general information about managing hepatic artery, portal vein, and bile duct injuries, but for specific cases, it is essential to consult with a medical professional who has access to the patient's medical records and can provide personalized advice.

    Edit to add [this erratum](https://doi.org/10.1016/j.resourpol.2023.104336):

    > The authors apologize for including the AI language model statement on page 4 of the above-named article, below Table 3, and for failing to include the Declaration of Generative AI and AI-assisted Technologies in Scientific Writing, as required by the journal’s policies and recommended by reviewers during revision.

    Edit again to add [this article in *Urban Climate*](https://doi.org/10.1016/j.uclim.2023.101622):

    > The World Health Organization (WHO) defines HW as “Sustained periods of uncharacteristically high temperatures that increase morbidity and mortality”. Certainly, here are a few examples of evidence supporting the WHO definition of heatwaves as periods of uncharacteristically high temperatures that increase morbidity and mortality

    And [this one in *Energy*](https://doi.org/10.1016/j.energy.2023.127736):

    > Certainly, here are some potential areas for future research that could be explored.

    Can't forget [this one in *TrAC Trends in Analytical Chemistry*](https://doi.org/10.1016/j.trac.2023.117477):

    > Certainly, here are some key research gaps in the current field of MNPs research

    Or [this one in *Trends in Food Science & Technology*](https://doi.org/10.1016/j.tifs.2024.104414):

    > Certainly, here are some areas for future research regarding eggplant peel anthocyanins,

    And we mustn't ignore [this item in *Waste Management Bulletin*](https://doi.org/10.1016/j.wmb.2024.01.006):

    > When all the information is combined, this report will assist us in making more informed decisions for a more sustainable and brighter future. Certainly, here are some matters of potential concern to consider.

    The authors of [this article in *Journal of Energy Storage*](https://doi.org/10.1016/j.est.2023.109990) seem to have used GlurgeBot as a replacement for basic formatting:

    > Certainly, here's the text without bullet points:

    58
    27
    https://paulgraham.com/best.html

    In which a man disappearing up his own asshole somehow fails to be interesting.

    18
    21

    So, there I was, trying to remember the title of a book I had read bits of, and I thought to check [a Wikipedia article](https://en.wikipedia.org/wiki/History_of_quantum_mechanics) that might have referred to it. And there, in "External links", was ... "Wikiversity hosts a discussion with the Bard chatbot on Quantum mechanics".

    How much carbon did you have to burn, and how many Kenyan workers did you have to call the N-word, in order to get a garbled and confused "history" of science? (There's a *lot* wrong and even self-contradictory with what the stochastic parrot says, which isn't worth unweaving in detail; perhaps the worst part is that its statement of the uncertainty principle is a [blurry JPEG](https://www.newyorker.com/tech/annals-of-technology/chatgpt-is-a-blurry-jpeg-of-the-web) of the average over all verbal statements of the uncertainty principle, most of which are wrong.) So, a mediocre but mostly unremarkable page gets supplemented with a "resource" that is actively harmful. Hooray.

    Meanwhile, over in [this discussion thread](https://awful.systems/comment/2190355), we've been taking a look at the Wikipedia article [Super-recursive algorithm](https://en.wikipedia.org/wiki/Super-recursive_algorithm). It's rambling and unclear, throwing together all sorts of things that somebody somewhere called an exotic kind of computation, while seemingly not grasping the basics of the ordinary theory the new thing is supposedly moving beyond.

    So: What's the worst/weirdest Wikipedia article in your field of specialization?

    21
    16
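
    For reference, the textbook statement that those verbal versions of the uncertainty principle are paraphrasing is the Kennard inequality for position and momentum,

    $$\sigma_x \, \sigma_p \ge \frac{\hbar}{2},$$

    together with Robertson's generalization $\sigma_A \, \sigma_B \ge \tfrac{1}{2} \lvert \langle [\hat{A}, \hat{B}] \rangle \rvert$ for any pair of observables.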