I didn’t get to talk to the owner.
I live in Germany, and I spotted one of these trucks recently. It looked huge compared to every other vehicle on the road, including a delivery van, and it was too big for its parking spot. It also had a Confederate flag in the back window.
Self-control is a road to many abilities, some of which are considered unnatural, such as not drinking coffee. My personal favorite is to connect two points in spacetime, creating a hole you can crawl through from one side to the other, for example, to connect the University of Tübingen and Boulogne-sur-Mer.
I’m not sure we are discussing the same aspect of this thought experiment. The aspect of it that I find Lovecraftian is that you may already be in the simulation right now. This makes the specific circumstances of our world, physics, and technology level irrelevant, as they would just be a solipsistic setup to test you on some aspect of your morality. The threat of eternal torture, on the other hand, would only apply to you if you were the real version of you, as that’s who the basilisk is actually dealing with. This works because you don’t know which of the two situations you are currently in.
Wondering whether you are in a simulation or not is rather unproductive, as there’s basically nothing we can do about it regardless of what the answer is. It’s basically like wondering whether god exists or not. In the absence of clearly supernatural phenomena, the simpler explanation is that we are not in a simulation, as any universe which can produce the simulation is by definition at least as complex as the simulation. The definition I’m applying here is that the complexity of a string is its length or the length of the shortest program that produces it. Like, yes, we could be living in a simulation right now, and deities could also exist.
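The description-length definition above can be illustrated with a tiny sketch (the string and program here are made up for illustration): a very regular string is no more complex than the short program that generates it.

```python
# A minimal sketch of the description-length (Kolmogorov-style) idea:
# a regular string can be produced by a program far shorter than itself,
# so its complexity is bounded by that program's length.
regular = "ab" * 1000            # a 2000-character string
program = '"ab" * 1000'          # an 11-character description of it
assert eval(program) == regular  # the short program reproduces the string
assert len(program) < len(regular)
```

The simulation argument runs in the other direction: whatever produces the simulation must contain at least a full description of it, so it can’t be simpler than the simulation itself.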
The song “Seele Mein” (engl.: “My Soul” or “Soul of Mine”) is about a demon who follows a mortal from birth to death and then carries off the soul for eternal torture. Interestingly, the song is told from the perspective of the demon, who glosses over the life of the mortal and spends more than half of the song describing the torture. Could such demons exist? Certainly, there’s nothing that rules out their existence, but there’s also nothing indicating that they exist. So they probably don’t. And if you are being followed around by such a demon? Then you’re screwed. Theoretically, every higher being that has ever been thought of could exist. A supercomputer simulating our reality falls squarely into the category of higher being. Unless we observe things that are clearly caused by such a being, wondering about its existence is pointless.
The idea behind Roko’s Basilisk is as follows: Assume a good AGI. What does that mean? An AGI that follows human values. And since the idea originated on Less Wrong, this means utilitarianism. It also means that we’re dealing with a superintelligence, since on Less Wrong it’s generally assumed that we’re going to see a singularity once true AGI is reached, because the AGI will just upgrade itself until it’s superintelligent. Afterwards it will bring about paradise, and thus create great value. The idea is now that it might be prudent for the AGI to punish those who knew about it but didn’t do everything in their power to bring it into existence. Through acausal trade, this would cause the AGI to come into existence sooner, as people would work harder to bring it about for fear of torture. And what makes this idea a cognitohazard is that by just knowing about it, you make yourself a more likely target. In fact, people who don’t know about it, or who dismiss the idea, are safe, and will find a land of plenty once the AGI takes over.
Of course, if the AGI is created in, let’s say, 2045, then nothing the AGI can do will cause it to be created in 2044 instead.
Roko’s Basilisk hinges on the concept of acausal trade: future events can cause past events if both actors can sufficiently predict each other. The obvious problem with acausal trade is that if you’re actor B in the future, then you can’t change what actor A in the past did. It’s A’s prediction of B’s action that causes A’s action, not B’s action itself. Meaning the AI in the future gains literally nothing by exacting petty vengeance on people who didn’t support its creation.
Another thing Roko’s Basilisk hinges on is that a copy of you is also you. If you don’t believe that, then torturing a simulated copy of you doesn’t need to bother you any more than if the AI tortured a random innocent person. On a related note, the AI may not be able to create a perfect copy of you. If you die before the AI is created, and nobody scans your brain (brain scanners currently don’t exist), then the AI will only have the surviving historical records of you to reconstruct you. It may be able to create an imitation so convincing that any historian, and even people who knew you personally, will say it’s you, but it won’t be you. Some pieces of you will be forever lost.
Then again, a singularity-type superintelligence might not be possible. The idea behind the singularity is that once we build an AI, the AI will improve itself, and will then be able to improve itself faster, thus leading to an exponential growth in intelligence. The problem is that this basically assumes that the marginal effort of getting more intelligent grows slower than linearly. If the marginal difficulty grows as fast as the intelligence of the AI, then the AI will become more and more intelligent, but we won’t see an exponential increase in intelligence. My guess would be that we’d see logistic growth of intelligence. As in, the AI will first become more and more intelligent, and then the growth will slow and eventually stagnate.
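The claim about marginal difficulty can be made concrete with a toy model (all numbers and the growth rule are assumptions for illustration, not a prediction): each step, intelligence increases by effort divided by difficulty, and how difficulty scales with current intelligence determines the shape of the curve.

```python
# Toy self-improvement model (illustrative assumptions, not a prediction):
# each step, intelligence grows by effort * i / difficulty(i).
def grow(difficulty, steps=50, effort=1.0):
    i = 1.0
    for _ in range(steps):
        i += effort * i / difficulty(i)  # a smarter AI contributes more effort
    return i

constant = grow(lambda i: 1.0)  # fixed difficulty: i doubles each step (exponential)
linear = grow(lambda i: i)      # difficulty keeps pace: i grows by 1 each step (linear)
```

With constant difficulty the model explodes exponentially; as soon as difficulty keeps pace with intelligence, the same rule yields merely linear growth. Add a hard ceiling on achievable intelligence and the curve bends into the logistic shape described above.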
You get a notification about sightstealer shrieks, so you’re going to be forewarned. Problem is that once they’re on the map, they act like normal raiders, except invisible until they’re about to attack somebody.
I haven’t seen a sightstealer retreat so far. Are you sure you’re not referring to a revenant instead?
Resisting Trump is a crime to these people.
Just as fast as a car if you run as fast as a car.
Eh, levels bring a linear increase in strength and durability, while an effective attack doubles your damage output. So you’d need twice your opponent’s level to make up for a type disadvantage. Of course, that’s assuming you’re fighting against a pokemon controlled by a human player. However, wild pokemon can’t take full advantage of their type advantage.
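The break-even arithmetic above can be sketched with a deliberately simplified damage model (assumed for illustration, not the actual game formula): damage scales linearly with level and is doubled or halved by type match-ups.

```python
# Simplified damage model (assumption for illustration, not the real formula):
# damage scales linearly with level and is multiplied by type effectiveness.
def damage(level, type_multiplier):
    return level * type_multiplier

# Doubling your level exactly offsets a halved (not-very-effective) attack:
assert damage(40, 0.5) == damage(20, 1.0)  # both deal 20
```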
Yes, that’s the idea. I won’t need to destroy the sunblocker, because the crops will die anyway.
Thanks for your reply. Are his insurance premiums going to go up?
What about the guy whose space yacht you stole? Was he another player or an NPC? If he was another player, will he have to buy a new space yacht for real money?
Umm … That AI-generated hentai on the page of the same article, though … Do the editors have any self-awareness? Reminds me of the time an admin decided the best way to call out CSAM was to link directly to the source.
The image depicts mature women, not children.
Yes, I made the ritual description up for a joke. I’ve never performed a human sacrifice.
When sacrificing the child, use a dagger made from obsidian. Cut upward from below the sternum, then force the rib cage apart. Push the lungs aside with your hands, then cut out the heart with your ritual dagger. Hold the heart up to the cheering crowd, and then place it in an earthen vessel in honor of the gods. Kick the body down the steps of the temple pyramid.
The thing with live services is, they take up so much of the user’s time that there can only be a handful of successful live service games at any one time. So any company that thinks they can just push out a live service game and make tons of money is mistaken. Of course, any CEO who doesn’t want to make live service games will need to explain to their shareholders why not. That’s an easy explanation for a small company, which can just say it doesn’t have the manpower. But a big company doesn’t have that excuse.
How restrictive do you want to be with the accounts? If you’re too restrictive, there won’t be enough users. If you’re not restrictive enough, the data will be used for AI training.
Why is the cop so chubby? That vest he’s wearing looks like it’s about to tear open any minute.
Yes, the idea is good, I just don’t trust AI to do a good job.