Haha, thanks for the correction. If you want to put your degree in ethics to use, perhaps you could add your perspective to the thread?
If you can get past the weird framing device, the Plinkett reviews of the Star Wars prequels are an excellent deep dive into the issues with those films: https://www.youtube.com/watch?v=FxKtZmQgxrI&list=PL5919C8DE6F720A2D
Jenny Nicholson’s videos are great, but her documentary on “The Last Bronycon” is special, as the realization dawns on you while watching that she has more of a connection to Brony culture than you might have guessed: https://www.youtube.com/watch?v=4fVOF2PiHnc
From a consequentialist perspective, the only issue one could have with deepfakes is the distribution of pornography that should only be used privately. The author dismisses this take, noting that “few people see his failure to close the tab as the main problem.” I guess I am one of the few.
Another perspective is to consider the pornography itself to be impermissible, which, as the author notes, implies that (1) is also impermissible. Most would agree (1) is morally fine (some may consider it disgusting, but that doesn’t make it immoral).
In the example of Ross teasing Rachel, the author concludes that the imagining, rather than the teasing itself, is the moral quandary. Drinking water isn’t immoral. Sending a video of yourself drinking water isn’t immoral. But sending that video to someone dying of thirst is.
The author’s conclusion is also odd:
Today, it is clear that deepfakes, unlike sexual fantasies, are part of a systemic technological degrading of women that is highly gendered (almost all pornographic deepfakes involve women) […] Fantasies, on the other hand, are not gendered […]
For microcontrollers, quite often. Mainly because visibility is poor, you’re often trying to do stupid things, problems tend to be localized, and JTAG is easier than a firmware upload.
For other applications, rarely. Debuggers help when you don’t understand what’s going on at a micro level, which is more common with less experience or when the code is more complex due to other constraints.
Applications running on full-fledged operating systems often have plenty of log output, and it’s trivial to add more, formatted however you need. Printouts give you a broad slice of the application, and you can iteratively tune them to exactly what you need, vs. a debugger, which is better suited to observing a small slice of the application.
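To make that concrete, here’s a minimal sketch of the kind of tunable printout I mean, in C. The TRACE macro and the DEBUG_TRACE switch are illustrative names, not from any particular codebase:

```c
#include <stdio.h>

/* Build with -DDEBUG_TRACE for a chatty binary, without it for a quiet one. */
#ifdef DEBUG_TRACE
#define TRACE(...) do {                                   \
        fprintf(stderr, "[%s:%d] ", __FILE__, __LINE__);  \
        fprintf(stderr, __VA_ARGS__);                     \
        fputc('\n', stderr);                              \
    } while (0)
#else
#define TRACE(...) ((void)0)
#endif

int main(void)
{
    int retries = 3;
    TRACE("starting up, retries=%d", retries); /* cheap to add, easy to grep */
    return 0;
}
```

The loop is: add a print, rebuild, grep the output, repeat. For getting a broad view of a running program, that iteration is often faster than single-stepping.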
Cool, you posted the original with the Tim Minchin callout.
The approach requires multiple base stations, each lying in the path of a ray that is detected at both the station and the receiver, and the receiver’s position can only be determined if it can communicate with the stations.
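If I’ve understood the scheme, each shared detection gives you a range from a known station to the receiver: the ray travels at roughly c, so the range is about c times the gap between the two detection timestamps, and the position then falls out of ordinary trilateration. Here’s a minimal 2D sketch in C under exactly that assumption, with made-up station positions and ranges:

```c
#include <stdio.h>

/* A station at a known position, plus the range to the receiver inferred
 * from timing the same ray at both ends (d = c * (t_receiver - t_station),
 * which is my assumption about the measurement model). */
typedef struct { double x, y, d; } Station;

/* Classic trilateration: subtracting the first range equation from the
 * other two cancels the quadratic terms, leaving a 2x2 linear system.
 * Returns 0 on success, -1 if the stations are collinear. */
static int trilaterate(const Station s[3], double *rx, double *ry)
{
    double a11 = 2.0 * (s[1].x - s[0].x), a12 = 2.0 * (s[1].y - s[0].y);
    double a21 = 2.0 * (s[2].x - s[0].x), a22 = 2.0 * (s[2].y - s[0].y);
    double b1 = s[0].d * s[0].d - s[1].d * s[1].d
              + s[1].x * s[1].x - s[0].x * s[0].x
              + s[1].y * s[1].y - s[0].y * s[0].y;
    double b2 = s[0].d * s[0].d - s[2].d * s[2].d
              + s[2].x * s[2].x - s[0].x * s[0].x
              + s[2].y * s[2].y - s[0].y * s[0].y;
    double det = a11 * a22 - a12 * a21;

    if (det == 0.0)
        return -1;
    *rx = (b1 * a22 - a12 * b2) / det;
    *ry = (a11 * b2 - a21 * b1) / det;
    return 0;
}

int main(void)
{
    /* Ranges computed from a ground-truth receiver at (1, 2). */
    Station s[3] = {
        { 0.0, 0.0, 2.2360679 }, /* sqrt(5)  */
        { 4.0, 0.0, 3.6055512 }, /* sqrt(13) */
        { 0.0, 5.0, 3.1622776 }, /* sqrt(10) */
    };
    double x, y;

    if (trilaterate(s, &x, &y) == 0)
        printf("receiver at (%.3f, %.3f)\n", x, y);
    return 0;
}
```

A real system would presumably use more stations and a least-squares fit to soak up timing noise, but the geometry is the same.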
That reminds me of a joke.
A museum guide is talking to a group about the dinosaur fossils on exhibit.
“This one,” he says, “is 6 million and 2 years old.”
“Wow,” says a patron. “How do you know the age so accurately?”
“Well,” says the guide, “it was 6 million years old when I started here 2 years ago.”