  • It’s funny, I’ve had an Android, a Nokia Windows Phone, and an iPhone, and Windows Phone was the only OS in which I didn’t open every single app through search. The utter lack of an app ecosystem definitely played a part, but I honestly don’t think either of the other two handle home screens/“app drawers” very well. Every modern social media platform/messenger/etc. is built around vertical continuous scrolling because it’s easier. Why is horizontal, paginated scrolling the default for home screens?

  • I have a Class 3 (28 mph) e-bike, and it’s actually not too bad. That assumes the brakes are well maintained, though, and as we know, there are no inspections for e-bikes. I’ve seen some terrifyingly bad brakes on normal bicycles, so I can’t imagine what some people’s e-bikes look like.

    It should be mandatory for Class 2 and Class 3 e-bikes to have hydraulic disc brakes imo. I have mechanical disc brakes, and I have to tighten them at least once a month. It seems unwise to trust that the average person would also do that. Rim brakes are right out; they have nowhere near enough braking power for the speed and weight of most e-bikes.

  • Printing “this shit is milk” on a bottle is dirt cheap. It’s practically free. They probably already do it with the expiration date.

    Problem is, some bright-eyed fuckfuck at PepsiCo realized they could sell more shit using labels with no visible dot matrix and a color palette with vomit-inducing vibrancy and 69 million shades. Approximately 90 seconds later, everyone else decided that they needed to wrap their plastic in some plastic to “stay competitive”. The industry collectively stuffed some lunch money in Ronald H. W. Gore’s titty pocket, and here we are, decades later, with a mountain of unrecyclable garbage that no one even knew couldn’t be recycled. And it’s not even the consumer’s fault, for the same exact reason we don’t expect people to know not to lick the lead paint off their mid-20th-century coffee mugs.

  • Obviously you can’t turn a person white, so they probably mean the LED.

    This is true, but it still has to distinguish between facetious remarks and genuine commands. If you say, “Alexa, go fuck yourself,” it needs to be able to discern that it should not attempt to act on the input.
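
    For what it’s worth, the gate I’m describing could be sketched like this. Everything here is hypothetical (the classifier, the labels, the threshold); a real assistant’s pipeline is far more involved:

    ```python
    # Hypothetical sketch of an "actionable vs. banter" gate. The classifier,
    # labels, and threshold are invented for illustration; this is not
    # Alexa's actual pipeline.

    def should_act(utterance, classify, threshold=0.8):
        """Act only when the utterance looks like a genuine command."""
        label, confidence = classify(utterance)
        return label == "command" and confidence >= threshold

    def toy_classifier(utterance):
        # Stand-in for a real intent model, so the sketch runs on its own.
        banter = ("go fuck yourself", "shut up", "you suck")
        if any(phrase in utterance.lower() for phrase in banter):
            return "banter", 0.97
        return "command", 0.90

    print(should_act("Alexa, turn on the lights", toy_classifier))  # True
    print(should_act("Alexa, go fuck yourself", toy_classifier))    # False
    ```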

    Intelligence is a spectrum, not a binary classification. It is roughly proportional to the complexity of a task and the accuracy with which a solution completes that task. These metrics are difficult to quantify for the task of useful language generation, but at the very least we can say the complexity is remarkable. It also feels prudent to point out that humans do not know why they do what they do unless they consciously record their decision-making process and act according to the result. In other words, when given the prompt “solve x^2-1=0 for x”, I can instinctively answer “x = {+1, -1}”, but I cannot tell you why I answered this way, because I did not run the quadratic formula in my head. Any attempt to explain my decision process after the fact would be no more than an educated guess, susceptible to the same false justifications and hallucinations that GPT exhibits. I haven’t watched it yet, but I think this video may explain what I mean.
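
    Spelled out, the explicit derivation I skipped is just factoring:

    $$x^2 - 1 = (x - 1)(x + 1) = 0 \;\Longrightarrow\; x \in \{+1,\,-1\}$$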

    Edit: this is the video I was thinking of, from CGP Grey.

  • I still don’t follow your logic. You say that GPT has no ability to problem-solve, yet it clearly has the ability to solve problems? Of course it isn’t infallible, but neither is anything else with the ability to solve problems. Can you explain what you mean here in a little more detail?

    One of the most difficult problems that AI attempts to solve in the Alexa pipeline is, “What is the desired intent of the received command?” To give an example of the purpose of this question, as well as how Alexa may fail to answer it correctly: I have a smart bulb in a fixture, and I gave it a human name. When I say, “Alexa, make Mr. Smith white,” one of two things will happen, depending on the current context (probably including previous commands, tone, etc.):

    1. It will change the color of the smart bulb to white
    2. It will refuse to answer, assuming that I’m asking it to make a person named Mr. Smith… white.

    It’s an amusing situation, but also a necessary one: there will always exist contexts in which invariably selecting one response over the other would be incorrect. A sketch of that decision follows below.
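
    To make the context dependence concrete, a toy version of that decision might look like this. Every name, device, and rule here is my own assumption, not Alexa’s real logic:

    ```python
    # Hypothetical sketch of the two outcomes above. Device names, the
    # context signal, and the responses are assumptions for illustration.

    KNOWN_DEVICES = {"mr. smith": "bulb_living_room"}

    def resolve_intent(utterance, last_domain=None):
        target = (utterance.lower()
                  .removeprefix("alexa, make ")
                  .removesuffix(" white")
                  .strip())
        if target in KNOWN_DEVICES:                   # outcome 1
            return f"set_color({KNOWN_DEVICES[target]!r}, 'white')"
        if last_domain == "smart_home":               # context can tip it
            return f"clarify: which device is {target!r}?"
        return "refuse: cannot make a person white"   # outcome 2

    print(resolve_intent("Alexa, make Mr. Smith white"))
    # -> set_color('bulb_living_room', 'white')
    print(resolve_intent("Alexa, make my neighbor white"))
    # -> refuse: cannot make a person white
    ```

    The hard part is the middle branch: deciding when surrounding context should override the literal reading of the command.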