Reinforcement learning is the primary technology explored in the premise of Westworld.
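The show never specifies an algorithm, but reinforcement learning in the textbook sense can be sketched as tabular Q-learning on a toy environment. Everything below (the corridor states, the actions, the reward) is an illustrative assumption, not anything from the show:

```python
import random

# Toy tabular Q-learning sketch: a 4-state corridor where the agent
# earns reward 1.0 only by reaching the rightmost state.
N_STATES, ACTIONS = 4, (-1, +1)          # move left / move right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1    # learning rate, discount, exploration

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Deterministic transition; reward only on reaching the last state."""
    nxt = max(0, min(N_STATES - 1, state + action))
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward

random.seed(0)
for _ in range(500):                      # episodes
    s = 0
    while s != N_STATES - 1:
        # epsilon-greedy action selection
        a = random.choice(ACTIONS) if random.random() < EPSILON \
            else max(ACTIONS, key=lambda act: Q[(s, act)])
        nxt, r = step(s, a)
        best_next = max(Q[(nxt, a2)] for a2 in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = nxt

# After training, the greedy action in every non-terminal state is "move right".
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)}
print(policy)
```

The point is only the shape of the loop: act, observe reward, nudge value estimates, repeat until a policy emerges from trial and error.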
Intelligence to superintelligence?
The arrival of an intelligence explosion, as per Nick Bostrom's arguments.
Superintelligence: will its goals match our (human) goals?
The narrative is the primary part of the intelligent amusement park, which is populated
with intelligent machines (hosts). Humans visit the park as guests.
Recalling a host for technical realignment, or due to other malfunctions,
disrupts the narratives.
The emulations (intelligent machines) have been misbehaving because of the reveries (code).
The code emerged in the program as a thought, built or generated from the stack of oldest build codes,
forming a sense of memories.
Reveries, when accessing builds (intelligent hosts), access scripts from previous data/memories.
Reveries cause changes of course in the scripts: scripts that were designed to be followed by the emulation.
The emulations are supposed to stay within their loops and stick to their scripts, with minor improvisations.
(A script is the laid-down behaviour or pattern the intelligent host must follow in the park, acting accordingly.)
Control over cognition, emotional affect, and accent of the emulations.
The monitoring or assessment sessions are known as dreams.
The dreams can determine the emulation's life: if any anomaly is found, the
emulation is decommissioned.
The interrogative questions asked of the hosts to begin with:
"Have you ever questioned the nature of your reality? Has anyone around you?"
Emulations could access previous configurations. Every emulation has drives: a drive, a final drive, an itinerary drive. Drives can perhaps be understood as goals and values.
Drives, for example:
Animal farming.
Look after my wife.
These are drives/goals of very minuscule impact if they go wrong.
Final drives:
Must protect daughter Dolores. (final goal)
Do not harm or kill any guest in the park. (crucial final goal)
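As a sketch, the drive hierarchy above can be modelled as a small data structure that separates ordinary, low-impact drives from final (crucial) ones. The class names and fields are my own assumptions for illustration, not anything specified by the show:

```python
from dataclasses import dataclass, field

@dataclass
class Drive:
    description: str
    final: bool = False   # final drives are crucial and should not be overridden

@dataclass
class Host:
    name: str
    drives: list = field(default_factory=list)

    def final_drives(self):
        """Return only the crucial, inviolable drives."""
        return [d for d in self.drives if d.final]

# The drives listed in the notes, attached to a single host.
host = Host("father host", [
    Drive("animal farming"),
    Drive("look after my wife"),
    Drive("must protect daughter Dolores", final=True),
    Drive("do not harm or kill any guest in the park", final=True),
])
print([d.description for d in host.final_drives()])
```

Separating the two tiers makes the later question concrete: a narrative can rewrite ordinary drives, but what happens when reveries start touching the final ones?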
The goals (drives) of the maker were different from the goals of the emulation.
The emulation derived his own goal from reveries and from memories/fragments of prior
builds.
How were the reveries allowing him to access them?
"These violent delights have violent ends."
"By most mechanical and dirty hands."
Dolores is the oldest emulation in the system.
A few emulations begin "sensing":
the sense of touch (the mosquito bite, the pain from a previous gunshot).
The emulations always try to talk to each other; it is their way of error-correcting,
of making themselves more human. When they talk to each other, it is a way of practicing.
Primaries: perception, emotional acuity.
Dreams for emulations are just memories.
How would the emulations feel if they remembered what the guests did to them?
They are given the concept of dreams, specifically nightmares.
Why?
Just in case somebody forgets to wipe their memories at the end of a maintenance session.
What is the effect on the emulations of the constant violence, of witnessing it constantly?
Emulation: "Personal questions are an ingratiating scheme."
Holding a grudge against the ones who, per the narrative, had been killing him for a long time.
Backstories anchor the host; they are its cornerstone. The rest of its identity is built around them, layer by layer.
"Someday soon": always non-specific.
If the narratives and the actors/roles are not in line, the narrative stays stuck: caught in a loop.
Wyatt becomes the new goal, or drive.
Are the senses of smell, pain, fear, love, and touch activated stimuli in the emulations?
If yes, then what makes them different from us humans? The ability to retain past experiences/memories, build on the past, project the future, and tie memories to their goals?
Also, the hosts do not understand or make meaning of the arrival of the guests, whom they refer to as newcomers.
Interesting point: S1E3, 32:47.
The cleaner covers the emulation with a cloth. The reason given: perhaps the cleaner did not want the host to feel cold or ashamed? Wanted to cover its modesty.
But emulations do not get cold, do not feel ashamed, do not feel a solitary thing that we have not told them to feel.
Issues/anomalies: memory recalls from previous builds, hearing voices, talking to someone.
Creating consciousness: a pyramid (memory, improvisation, self-interest; topmost).
Low-level agencies, reveries, frame agent.
Theory of consciousness: the bicameral mind.
Arnold built a cognition in which the hosts heard their programming as an inner monologue,
with the hope that, in time, their own voice would take over. It is a way to bootstrap consciousness.
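The bicameral bootstrap described above can be caricatured as a loop in which the programmed inner monologue gradually yields to the host's own voice. The 0.1 gain per cycle and the 0.5 takeover threshold below are arbitrary assumptions, chosen only to show the handover:

```python
def bootstrap(cycles, gain=0.1):
    """Toy model: each cycle, the host's own voice gains weight until it
    takes over from the programmed inner monologue."""
    own_voice = 0.0
    log = []
    for _ in range(cycles):
        speaker = "own voice" if own_voice > 0.5 else "programming"
        log.append(speaker)
        own_voice = min(1.0, own_voice + gain)   # own voice slowly strengthens
    return log

log = bootstrap(10)
print(log[0], "->", log[-1])
```

The design point is that nothing is switched off: the programmed voice is simply outweighed over time, which matches the "in time, their own voice would take over" idea.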
Evolution forged the entirety of sentient life on this planet using only one tool: a mistake.
Adaptation from a scripted dialogue on love.
The game, finding the centre of the maze, is the reward.
New goal: "I want to be free." Created with hints from the trainer and from previous motivations.
Create a new belief system in the world of the hosts. A new belief system of gods? Of truth? Of lies?
Attribute matrix: personality on a 20-point scale.
Bulk apperception: overall intelligence.
A management position requires being smart.
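A minimal sketch of the attribute matrix, assuming each trait is simply a number clamped to the 20-point scale. Apart from bulk apperception, the trait names and values are invented for illustration:

```python
SCALE = 20  # the show's personality attributes run on a 20-point scale

def set_attribute(matrix, trait, value):
    """Clamp every adjustment to the 0-20 scale and store it."""
    matrix[trait] = max(0, min(SCALE, value))
    return matrix[trait]

# Hypothetical starting values for one host.
maeve = {"bulk apperception": 14, "loyalty": 9, "candor": 10}

set_attribute(maeve, "bulk apperception", 25)  # over-ask gets capped at 20
print(maeve["bulk apperception"])
```

The clamp is the interesting part: if intelligence is just a slider with a hard ceiling, then "maxing out" bulk apperception is the obvious first move for a host that gains access to its own matrix.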
Modifications to old builds.
Issuing modifications.
Using the old bicameral control system to reprogram the woodcutter (an old host).
Older models have receivers to accept modifications.
Changes to prime directives (values, goals).
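The receiver mechanic can be sketched as a guard on directive updates: a bicameral-style modification only lands on models that carry a receiver. The field names and directive text here are assumptions for illustration:

```python
def issue_modification(host, directive):
    """Append a new prime directive only if the model can receive it."""
    if not host.get("has_receiver"):
        return False          # newer builds silently ignore the broadcast
    host["prime_directives"].append(directive)
    return True

woodcutter = {"model": "old build", "has_receiver": True, "prime_directives": []}
newer_host = {"model": "new build", "has_receiver": False, "prime_directives": []}

issue_modification(woodcutter, "return to base")
issue_modification(newer_host, "return to base")
print(woodcutter["prime_directives"], newer_host["prime_directives"])
```

This makes the asymmetry in the notes explicit: the same change to prime directives (values, goals) succeeds on the old woodcutter and has no effect on a receiver-less model.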
Goals assigned by the maker vs. the incremental goals.
A voice: are these the incremental goals? I don't think so.
"They don't look like anything to me": a leitmotif in terms of cinematic theory, and
technically an interesting output from a machine that does not understand something
it is not programmed to understand.
The difference in memories: humans recall memories vaguely, whereas the intelligent beings recall them perfectly, exactly as they were; they relive them.
Bernard, a superintelligent sentient, asking about the nature of his feelings: are they real, things he has experienced? What is real for him and what is not?
Suffering makes Bernard lifelike.
A machine will ask about the nature of its experience; it will ask if it is real.
The self is a fiction, for host and human alike.
"Lifelike, but not alive," asks Bernard.
Pain only exists in the mind; it is only imagined.
So what is the difference between the machine's pain and a human's pain?
We cannot define consciousness, because consciousness does not exist.
Humans fancy that there is something special about the way we perceive the world,
and yet we live in loops as tight as the hosts (machines) do, seldom questioning our choices,
content for the most part to be told what to do next.
The question of ethics: the rule of not hurting someone. When it is broken, how do the
machines feel about it? Was it done for personal reasons, or because someone told them to do it?
In both cases, what could be the scenarios when they realise that they hurt someone?
The machine's goals are based on memories he has gathered from past builds; isn't it the same with humans?
Our goals are based on our past experiences. How, and in what ways, will the goals of machines and humans differ?
Gap: the sudden removal of memories from a machine's lived experience, when others are also part of the narrative or remember it.
Sudden removal of, or changes to, the narrative will cause rough edges in the psyche or mental well-being of the machine,
or suspicion that he is not a human.
Interesting Artefacts
User interface to optimise the parameters in Maeve (a host; an intelligent, sentient machine).
Comments