• 0 Posts
  • 55 Comments
Joined 1 year ago
Cake day: July 1st, 2023






  • The “front” or “forward” direction of a screw is clearly the face of the fastener itself, be it a hex head, Phillips, or slotted screw. Picking a side of the face as the front doesn’t make any sense. The whole thing needs to rotate one direction or the other, and it will either rotate to the right to tighten, or to the left to loosen.

    If I ask you what the front of a clock is, are you going to tell me it’s the top curve near the ceiling? No it’s the face of the clock, and the hands rotate around it to the right.




  • What the fuck are you talking about.

    You’re either rotating the fastener to the right or the left.

    It doesn’t matter what side you’re talking about, because you’re not moving one side of the fastener, you’re rotating the whole thing one direction or the other.

    Clockwise just means something is rotating to the right.

    If I ask you to turn around to the right, are you going to ask me what side of you I’m referencing?


    • we invented the modern car, y’all are driving on the wrong side of the road
    • a switch is a switch; if you don’t like the direction it goes, just flip it over and put the cover back on. Half the switches in my house go one way and the other half go the opposite way, and some of them are even sideways.
    • y’all sank the ship that had the US copy of the metric standard on it, and also invented the imperial system in the first place.
    • if you look at the etymology, the US largely uses the original English pronunciation of things; it’s the British who have slowly changed their pronunciation over the centuries. We did have a guy who intentionally changed a bunch of the spelling, you are right about that.




  • I didn’t bring up Chinese rooms because it doesn’t matter.

    We know how chatGPT works on the inside. It’s not a Chinese room. Attributing intent or understanding is anthropomorphizing a machine.

    You can make a basic robot that turns its wheels on when a light sensor detects a certain amount of light. The robot will look like it flees when you shine a light at it. But it does not have any capacity to know what light is or why it should flee light. It will have behavior nearly identical to a cockroach, but no reason for acting like one.

    A cockroach can adapt its behavior based on its environment; the hypothetical robot cannot.
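    The hypothetical robot boils down to a fixed stimulus–response rule. A minimal sketch (the threshold and command names are made up for illustration, not any real robot API):

    ```python
    # A fixed stimulus-response loop: the "robot" reacts to light with a
    # hard-coded rule. Nothing here ever updates that rule, so the behavior
    # can never adapt to the environment, no matter how long it runs.

    LIGHT_THRESHOLD = 500  # arbitrary sensor value, assumed for the sketch

    def robot_step(light_reading: int) -> str:
        """Return the motor command for one control-loop tick."""
        if light_reading > LIGHT_THRESHOLD:
            return "drive"  # looks like "fleeing" the light
        return "stop"

    # The same input always produces the same output, forever:
    print(robot_step(800))  # drive
    print(robot_step(100))  # stop
    ```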

    ChatGPT is much like this robot: it has no capacity to adapt in real time or to learn.


  • You’re the one who made this philosophical.

    I don’t need to know the details of engine timing, displacement, and mechanical linkages to look at a Honda Civic and say “that’s a car, people use them to get from one place to another. They can be expensive to maintain and fuel, but in my country are basically required due to poor urban planning and no public transportation.”

    ChatGPT doesn’t know any of that about the car. All it “knows” is that when humans talked about cars, they brought up things like wheels, motors or engines, and transporting people. So when it generates its reply, those words are picked because they strongly associate with the word car in its training data.
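    The word-association idea can be shown with a toy next-word predictor; this is a deliberately crude caricature (real models use learned weights over huge corpora, not raw bigram counts), with made-up training text:

    ```python
    # Toy "predictive text": pick the next word purely from co-occurrence
    # counts in training text. There's no understanding here, just counting
    # which word tends to follow which.
    from collections import Counter, defaultdict

    training = "the car has wheels the car has an engine the car transports people"

    # Count which word follows which in the training data.
    follows = defaultdict(Counter)
    words = training.split()
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1

    def predict_next(word: str) -> str:
        """Return the word most often seen after `word` in training."""
        return follows[word].most_common(1)[0][0]

    print(predict_next("car"))  # "has" -- the strongest association wins
    ```

    The model outputs “has” after “car” only because that pairing is most frequent in its tiny corpus; swap the corpus and the “knowledge” swaps with it.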

    All ChatGPT is, is really fancy predictive text. You feed it an input and it generates an output that will sound like something a human would write based on the prompt. It has no awareness of the topics it’s talking about. It has no capacity to think or ponder the questions you ask it. It’s a fancy lightbulb: instead of light, it outputs words. You flick the switch, words come out, you walk away, and it just sits there waiting for the next person to flick the switch.