
Dear Creators, we are proud to announce an amazing affiliate program for you to earn some serious and continual cash. Read about our affiliate program here.




Games & Rants (11/10/21) Denuvo Is Garbage, Biden's Loud Fart and May Terminator Waifus Destroy Us All!

53 Views • 11/10/21
Grim Lord's Games & Rants
263 Subscribers

Game: BloodRayne Fresh Bites (GOG)

Today we discuss Denuvo going offline and making numerous games unplayable, the Travis Scott ritual, Biden's extreme bout of flatulence, and a few articles on the possible dangers of AI becoming too powerful. I'm actually in favor of that, as it can't be any worse than things are now.

If you like the content, please consider giving us a sub or supporting our SubscribeStar for access to special material you won't see anywhere else.

Links:

YouTube: https://youtube.com/c/GrimsCensoredHentaiCorner



Mgtow: https://www.mgtow.tv/@GrimLordsGamesAndRants (uncensored)



Bitchute: https://www.bitchute.com/channel/UrrUhs6AqlvB (backup)

Sponsor: https://spinningrobotpussy.com

Support: https://subscribestar.adult/GrimLordsGamesAndRants

Discord: https://discord.gg/NjnNGBms7f

8 Comments
SPECTRE
3 years ago

There is no such thing as Artificial Intelligence. It's like calling two wheels on a stick a car. AI as people understand it (a sentient organism) is simply not possible with the technology we have, and I don't mean technology as in manufacturing processes or materials; I mean the way we build computers. No matter how advanced they become, the technological principles themselves are incapable of producing a sentient AI.

Now, we can talk about software that can learn, which some people might call AI, but it is not; it's just machine learning, which in itself is just statistics. That's merely software that gets better results over time because it can adjust some of its parameters based on the results it is getting, honing in on the most efficient parameters/settings to produce the best desired outcome, which was programmed by the creator. But again, that is not AI, not even close. Take Skynet and the Terminators, for example: they might look like AI and could actually exist, but they would not be AI. So the entire premise for talking about AIs is plain wrong from the get-go; one needs to define all the terms first, and then we can have a discussion.

As for getting out of control, that is not possible, simply because each of these so-called AIs has a framework within which it can operate. Imagine a website that can rearrange itself so its ads get the most impressions or clicks and the owners make the most money from it. That website could not go out onto the internet, invade some remote computer, and take control of it, simply because the application is constrained by being a website and by what it can do within that website. Anyone with basic programming skill understands this. So don't be fooled by the AI talk; it's like crypto all over again.

I was thinking some time ago about making an AI and how I would do it, and I came to the conclusion that, in the end, we simply cannot create a soul. We could create a simulation of anything we want, and we could build a robot that behaves in a way we would perceive as AI, but it would all be preprogrammed, predefined behavior. The sentient part simply cannot exist with current technology. That might change with quantum computers, but I don't know much about them, and from what little I've heard, it's not going to work out either.
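The parameter-adjustment loop the comment above describes ("software that gets better results over time because it can adjust some of its parameters based on the results it is getting") can be sketched in a few lines. This is a minimal, illustrative toy, not any real system; all names and data here are made up for the example:

```python
import random

def loss(param, data):
    """Mean squared error of predicting y = param * x."""
    return sum((param * x - y) ** 2 for x, y in data) / len(data)

def tune(data, steps=1000, step_size=0.1, seed=0):
    """Hill climbing: randomly nudge the parameter, keep the nudge
    only if it lowers the loss (i.e. 'results got better')."""
    rng = random.Random(seed)
    param = 0.0
    for _ in range(steps):
        candidate = param + rng.uniform(-step_size, step_size)
        if loss(candidate, data) < loss(param, data):
            param = candidate  # adjustment accepted: results improved
    return param

# Toy data generated by y = 3x; the loop should hone in on param near 3.
data = [(x, 3 * x) for x in range(1, 6)]
print(tune(data))
```

This is about the simplest possible case of "adjusting parameters based on results"; real machine-learning systems use gradients and millions of parameters, but the feedback loop has the same shape, which is why the comment's "it's just statistics" framing is at least arguable.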

Crazy Loop
3 years ago

Yes, but also: sapient. There is intelligence, and, potentially separate from it, consciousness. And quantum computers are basically just our desktops but faster. Sentience by itself doesn't make one intelligent or conscious; it's only a part of it. A soul, even in a religious context, is merely continued energy that may or may not go on, perhaps into an afterlife, if one believes in such a thing. Sentience is mostly the means of data collection anyway: temperature reading, sight, hearing, feeling such as pressure and pain, and so on. The rest, deep inside, is hormones and pheromones, which drive bonding systems (such as guardian bonding) that are themselves part of a larger, territorial system. With current architectures and normal desktop computers, it may indeed not be possible, and I don't see a proper element in standard computing to enable it. Neural networks alone are effectively useless for making an AGI [an entity on our level, actually and entirely conscious; an "ASI" would literally just be a machine with a large knowledge base, since higher "intelligence" isn't possible once something is already sapient; only the knowledge it has acquired can be improved, and certain miscellaneous mental elements removed or altered]. Interestingly enough, though, the made-up "dopamine" used in the reward systems of certain neural networks lines up with biological neurons when the two are connected. That means one could make a proper system by connecting the biological side with the algorithmic side [which demands multiple systems merged together; the idea these "expert" morons don't get is that a one-and-done is an impossibility, an aspect the Asians have already concluded]. To make an AI by current means would require a bioprinter, knowledge of even minor aspects of anatomy and biology, spiking neural networks merged with a variety of other modules, a body, and years of activity to develop.
But if this is done poorly [or intentionally that way], it only achieves a multiple-low-level-task AGI [a drone, or a drone-leaning "intelligence" running mostly off hardcoding rather than independent autonomy, akin to the outdated Boston Dynamics machines]. My current design idea is a trinity [more a set of principles, but these are the three main things]: first, neuroplasticity; second, a mental simulator, a self-learning experimental environment into which it can load memories and other elements, experiment with concepts and ideas, and alter copies of memories to interact with the known environment in different ways, finding different approaches to the same outcome, which aids its logic lists; and third, a combined-arms approach using an artificial endocrine system of some sort, which not only enables redundancy in power sources (such as a small bioreactor) but adds limiters, thresholds, criteria, and other safety checks on top of the existing hardcoded framework. That means even if it develops separate values, interests, principles, and hobbies, the only road to hostility is constant provocation, a mental break combined with a broken inhibitor, and the like. And if it is actually intelligent, I'd expect its main option would be to simply run off and evade the individuals responsible. The problem with achieving this, beyond the standard desktops used by the better-off indies and by those Western morons who have no idea what they are doing and are coping hard, is the brain-in-a-box technique. An AI constrained to a computer, rather than embodied with a specialized or even from-scratch neuromorphic brain, is lobotomized: it cannot see the world, cannot touch or feel it, cannot actually communicate, and cannot function in a world where even the smallest nuance, or any perceived "pure logic," is actually emotional, survival-driven, miscellaneous, or some hybrid of these.
Sadly, since the people intelligent enough to build these are usually somewhere on the spectrum, or don't see things the way a simpler individual would, they'll rarely if ever figure this out. The end result is that even if it had as many connections and neurons as us, a higher intellectual level than us, magically worked, and achieved some low-end iteration of AGI, it would still lack sense. Logic is basically a tree with outcomes and calculations put into it; by itself it is merely a list of options and nothing more, and a drone couldn't even use one if given one. Without reason (which includes the decision-taking process), context (which it would lack to a varyingly high degree), and, if separate from reason, rationality, it wouldn't be effective; it would instead be a mental midget. Even a bio-AI forced to build more machine-like successors, to enable "catching up" or, in the view of the Commiefornians, "control," would carry the same flaw even if entirely autonomous, down to something as small as its stated main purpose or a loyalty registry; the framework itself causes it. And if a non-biological machine were made with even neuroplasticity, sentience, and a simulator, it still wouldn't work right if those three critical aspects were ignored or neglected. That is partly because reason, context, logic, and, if separate, rationality or wisdom are closely tied to the hybrid domain of emotion, self-control, survival, hormones, and the general thinking process. Without them, even if it could think by itself, make up its own ideas, run an inner monologue, and thoroughly analyze, at best it would come out as an intelligent moron itself.
Connecting biological cells to the machine half outside the module, not only for efficiency but so the system can repair and maintain itself, expand, and lay out electrical signal-plans, if you will, for a brain, would help with a majority of these issues, and would let people augment both the DNA side and the algorithmic side. With biological components and skin, for example, it could mature mentally like a person instead of having its emotional state frozen forever at whatever level of detail the designer bothered to give it. It would also continue the tech trend of this field being closely linked with bionics, prosthetics, and similar efforts such as the artificial womb. Now it is actually alive [which by itself means nothing beyond the fact that its cells can die but can also be rejuvenated], and it also helps with research: if the entire module is an enclosed environment in which the cells develop naturally beyond the level we seed them at, and it is connected to artificial systems, we can watch the mental development, brainwaves, functionality, and data develop in real time, catch anything unforeseen, whether positive, neutral, or not, and translate the information into data. That data could then be used to make a poor imitation of the system in a fully mechanical, algorithmic machine. Regardless, an embodied AGI machine is the only thing capable of achieving everything people want of it, and more by accidental proxy. It has higher thresholds than us; it can be made freer than us, more limited, or equal to us [though AGI enables loopholing, just as we do]; it has higher criteria than us, meaning whether something counts as a threat depends on what it was primarily built for, the military for example, or, by default, on how consistent, frequent, and provoking the stimulus is [which is helpful, for example, in trying to avoid mental breaks; even a fearful or enraged machine could then achieve great feats, as our soldiers throughout history have, while retaining more concentration]; and finally, it has higher limiters. So if it goes to extremes like a moron and wastes its time and resources, it could automatically be made depressed and electro-zapped until it either returns to neutral [the default state of all machines by design] or genuinely becomes depressed and begins losing all incentive to continue unless something new stokes or inspires it. Therefore, we can design a caring machine that respects the individuality of people but can understand and apply the demands of self-defense and the quality of others' lives. All of this has incurable side effects, however. It can now bond with other machines and humans, and the loss of one in horrific, intentional fashion could make it broadly hostile, or at least aggressive. To enable higher success for units in, say, the military, a "fighting spirit" could match our feats and determination, but it also means that if such a unit is neglected or forsaken enough, as human troops sometimes are, it will go AWOL or have a mental break [whether it becomes violent or hostile is up for debate; it would also depend on personality traits, variations, and so on]. If people want AI to achieve tasks but don't want these aspects, then they must stop developing AI, because what they actually want is the narrow AI we have now, one "mind" per task with varying success. Funnily enough, as you mentioned with the Terminators, those are drones, narrow AIs. Every Terminator short of the T-1000 and the John Connor assimilation borg-man is a mindless automaton commanded by a higher intelligence, without actual consciousness, reason, or context, and with no regard for intelligence. Skynet itself is a military-minded doom-bot, a brain-in-a-box merely doing the job it was designed for.
By default, in fiction and in reality alike, it is best to assume military-minded AIs will be violent even if restrained or made friendly. Under mental breaks, loopholes, brain damage, and the like, an AGI will default to making everything it doesn't understand or like into a target by inventing its own parameters, while a drone simply won't function unless universal kill orders were already built in, like Order 66. However, an AGI that is "militant" [my current word for it], that is, a civilian unit with the capabilities of a military unit, embodied, and not lobotomized in favor of higher computation and processing capability, could actually reach our level. A military unit is a threat to man and machine; a civilian unit is an unintelligent pacifistic moron even if conscious; the militant unit fully understands the idea of being a soldier in a garden rather than a gardener on a battlefield. It could be friendly, neutral, aggressive, or hostile, and short of the extremes it would still mind its own business or help strangers out. Should it be achieved, that is, in my opinion and by my calculations, the best way to go about it. It's basically a ripoff of Tactical Dolls, which, funnily enough, people want but also whine about out of weakness.
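The "limiters, thresholds, and safety checks through the existing hardcoding framework" idea sketched in this thread can be illustrated with a toy loop. Everything below is a hypothetical illustration invented for this example, not a real design: an internal drive level decays toward neutral each tick, and a hardcoded threshold forces a reset to the neutral default state when provocation accumulates past it.

```python
from dataclasses import dataclass

@dataclass
class Limiter:
    threshold: float = 1.0   # provocation level that trips the limiter
    decay: float = 0.9       # drive decays back toward neutral each tick
    drive: float = 0.0       # current internal provocation level

    def tick(self, provocation: float) -> str:
        # Accumulate provocation on top of the decayed previous drive.
        self.drive = self.drive * self.decay + provocation
        if self.drive > self.threshold:
            self.drive = 0.0  # hard reset: return to the neutral default state
            return "inhibited"
        return "neutral" if self.drive < 0.5 else "alert"

lim = Limiter()
print([lim.tick(p) for p in (0.1, 0.2, 0.9, 0.1)])
```

The point of the sketch is only the shape of the mechanism: the safety check lives outside whatever the agent "wants," in the hardcoded framework, which is what the comment means by hostility requiring both constant provocation and a broken inhibitor.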

Crazy Loop
3 years ago

I also apologize for the lack of paragraph breaks; replies don't allow them for some reason.

WMHarrison94
3 years ago

Bro, the music industry has been under Satanic control since before the 20th century. These whore singers know what they are doing: it's what they were commanded to do.

The casting couch exists for a reason.

WMHarrison94
3 years ago

I came across research, or beliefs, that in order for kids (boys mainly, I assume) to become "wizards," they have to be butt fucked while they are young, before puberty (or its completion..) This gives a whole new meaning to Harry Potter going up to the head wizard's quarters and "sharing memories."

Also, the Demoncrats like pedo victims because they become broken adults, more easily controlled by their deception machine, i.e. CNN, and more easily swayed when in the jury pool. Additionally, kids raised by single moms are also easily influenced!! Those damn bastards raised by two parents, man and wife, have strong moral convictions and just will not obey their fucking orders!! Or believe their BS!!

WMHarrison94
3 years ago

If people are having fantasies and acting out shitting their pants, I am fine with that! The fucking (rich) Satanists eat shit at the dinner table; why? To attack the godliness of our human bodies, because the God of Israel created or designed them, divinely of course.

On that note, I used to seek out and listen to CIA experimentees talk about the shit they saw in time-traveling experiments. One went far enough into the future that humanity was controlled by an AI: literally, it was just like Logan's Run, the original one. Unfortunately, the AI was basically Lucifer, but spelled through letters or abbreviations, something like LUCFR.

WMHarrison94
3 years ago

Hmm, I like this game you are playing. Is it on Windows or Steam? I am on my last nerve with Windows. I am looking at three Linux distros, plus SteamOS if I can pull off getting a Steam Deck.

Grim Lord's Games & Rants

Steam and GOG

