Author Topic: ARTUR Exmachina  (Read 10576 times)

Offline Pocky

  • bit
  • Posts: 6
    • View Profile
ARTUR Exmachina
« on: April 19, 2018, 06:53:01 PM »
The ARTUR project and discussion Paul is working on makes me think of Exmachina. Curiosity was a huge element that drove the AI.

Offline Paul

  • Administrator
  • double
  • *****
  • Posts: 3499
  • Developer
    • View Profile
    • PaulsCode.Com
Re: ARTUR Exmachina
« Reply #1 on: April 20, 2018, 09:29:14 AM »
I'm targeting more of an insect-level intelligence (which I believe is achievable with current technology).  But you are correct that curiosity is an absolutely essential element for any autonomous agent.  The agent must have some core motivation to try something different in a recognized context (or to try anything at all in a novel context).

DeepMind (and similar RL implementations) solve this with a diminishing random chance of non-ideal behavior in any given context.  This definitely works, but it has its own problems.  Training requires huge numbers of simulations in which the agent essentially brute-forces the problem until it reaches a policy that is deemed acceptable.  As we recently saw with the Uber self-driving car accident, this may not be good enough in situations where human safety is a concern.
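For reference, the "diminishing random chance of non-ideal behavior" is commonly implemented as epsilon-greedy action selection with a decaying exploration rate. This is a minimal generic sketch of that idea, not DeepMind's actual implementation; the function names and decay constants are illustrative.

```python
import math
import random

def epsilon_greedy(q_values, epsilon):
    """Pick the best-known action, but with probability epsilon
    pick a random one instead (the 'non-ideal behavior')."""
    if random.random() < epsilon:
        return random.randrange(len(q_values))
    return max(range(len(q_values)), key=lambda a: q_values[a])

def decayed_epsilon(step, start=1.0, end=0.05, decay=0.001):
    """Exploration rate shrinks from `start` toward `end` over
    training, so the agent gradually settles on its learned policy."""
    return end + (start - end) * math.exp(-decay * step)
```

The brute-force aspect Paul mentions falls out of this scheme: early in training epsilon is near 1.0, so the agent acts almost entirely at random until enough reward signal accumulates.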

My current thinking is that the key to understanding curiosity (which is critical to effectively implementing reinforcement learning) is first understanding emotions.  I recently read an interesting paper by Friedemann Pulvermüller which goes into the relevant biology in nice detail, with excellent visualizations.  It appears that emotional flavoring of sensorimotor mechanisms is at the core of how the brain establishes semantic grounding (and thus is a critical part of choosing what actions to take in a given context).
« Last Edit: April 20, 2018, 09:36:34 AM by Paul »
Device: Samsung Galaxy Nexus i515
CPU: TI OMAP4460, 1.2 GHz (dual core, ARM Cortex-A9)
GPU: PowerVR SGX540, 307 MHz
RAM: 1 GB
Resolution: 720 x 1280
Rom: omni-4.4.4-20141014-toro-FML KitKat 4.4.4, rooted

Device: Eee PC 1015PEM
CPU: Intel Atom N550, 1.5 GHz (dual core, x86)
GPU: Intel GMA 3150, 200 MHz (dual core)
RAM: 2GB
Resolution: 1024 x 600
Rom: android-x86-4.3-20130725 Jelly Bean 4.3, rooted

Offline Pocky

  • bit
  • Posts: 6
    • View Profile
Re: ARTUR Exmachina
« Reply #2 on: April 20, 2018, 02:16:01 PM »
On the topic of emotion, which the paper didn't really go into: emotion is an evolutionary trait that punishes and rewards survival behaviors. To "program" emotion, I guess you need to program a punishment and reward system, in a sense.

Offline Paul

  • Administrator
  • double
  • *****
  • Posts: 3499
  • Developer
    • View Profile
    • PaulsCode.Com
Re: ARTUR Exmachina
« Reply #3 on: April 20, 2018, 03:08:13 PM »
The significance of that paper here, I think, comes from taking it in the context of the amygdala:



Then relating that to Figure 2 from the paper:



The amygdala outputs to the prefrontal cortex (PF), which is a hub in the distributed circuit for semantics, which are grounded in sensory-motor activation.  This means that the sensory input, all semantic representations derived from it, and the generated motor outputs are all flavored by emotional context.  The obvious conclusion is that emotional context is a critical part of the sensory-motor circuit and of action decisions.

I interpret this to mean that in order to understand concepts like "curiosity" that are important for more efficient reinforcement learning strategies, the above circuit (with its emotional input) needs to be understood and made part of the model.

Offline Pocky

  • bit
  • Posts: 6
    • View Profile
Re: ARTUR Exmachina
« Reply #4 on: April 20, 2018, 04:13:25 PM »
So, you would have to program and categorize stimuli, and the reactions to those stimuli, based on the types of sensory input; that is how I imagine the approach would work in the AI sense, even at the insect level. It almost sounds like you would need to separate the AI's amygdala into subparts and have them work together (reference, consult, pass a logic checklist) to determine a reaction/action?

Offline Paul

  • Administrator
  • double
  • *****
  • Posts: 3499
  • Developer
    • View Profile
    • PaulsCode.Com
Re: ARTUR Exmachina
« Reply #5 on: April 20, 2018, 04:17:08 PM »
Yes, that is exactly what I am thinking.  It is interesting that the path to intelligent robots could end up being backwards from how it is imagined in sci-fi (where emotions tend to be absent, or developed much later).

Offline Pocky

  • bit
  • Posts: 6
    • View Profile
Re: ARTUR Exmachina
« Reply #6 on: April 20, 2018, 04:32:10 PM »
Well, primal emotion is different from what people would call human, Hollywood sci-fi emotion. The latter of course has to be developed much later. To design a true AI, I would think closely modeling basic biological evolution makes the most sense, not modeling after humans. I mean, in a way, you're rebuilding life and consciousness, sorta.

Offline Paul

  • Administrator
  • double
  • *****
  • Posts: 3499
  • Developer
    • View Profile
    • PaulsCode.Com
Re: ARTUR Exmachina
« Reply #7 on: April 21, 2018, 02:48:53 AM »
Yes, it definitely would not need anything like the elaborate range of emotions that humans have.  However, there are a couple which I believe are a basic part of intelligent behavior:

Fear.  This emotion ranges from being startled, to scared, to panicked.  It motivates the most basic behaviors, like capturing attention, freezing, fleeing, and aggression when cornered.

Curiosity.  This emotion scales with the novelty of a given context.  Some event first captures the creature's attention, and any threat has either passed or is below some threshold.  The creature is then motivated into investigatory behavior to accumulate more information about the event.

Boredom.  This emotion ranges from malaise, to frustration, to exasperation.  While this is a more complex emotion than an insect probably has, I think it is necessary in a non-biological intelligence to replace other drives like thirst/hunger/sex, which motivate exploratory behaviors.


Offline Pocky

  • bit
  • Posts: 6
    • View Profile
Re: ARTUR Exmachina
« Reply #8 on: April 21, 2018, 06:19:42 PM »
Putting it that way, you would need a lot of micro-decision branches to define the main categories of basic emotions in order to determine action. Now, how would you define fear to ARTUR? Will there be a punishment and reward system?

Offline Paul

  • Administrator
  • double
  • *****
  • Posts: 3499
  • Developer
    • View Profile
    • PaulsCode.Com
Re: ARTUR Exmachina
« Reply #9 on: April 22, 2018, 10:20:06 PM »
The system will need to be built around generic rules, such that certain types of behavior emerge as a result.  My current thoughts are:

Any given state in the system should be remembered along with the emotional context of that state each time it is encountered.  This emotional context should influence the actions taken in future, semantically similar states.  Actions define the possible branches from a given context.  Whatever particular combination of emotions the creature is experiencing should bias some actions over others, allowing the creature to try a variety of different actions on recurrences of a state.
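The state-plus-emotional-context memory described above might be sketched roughly like this. This is a hypothetical illustration, not ARTUR's actual data structures; the class and method names are invented for the example, and only fear is used as the biasing emotion here.

```python
from collections import defaultdict

class EmotionalMemory:
    """Remember each state together with the emotional context it was
    experienced under, and use that record to bias future choices."""

    def __init__(self):
        # state -> list of (emotion snapshot, action taken) records
        self.records = defaultdict(list)

    def remember(self, state, emotions, action):
        self.records[state].append((dict(emotions), action))

    def bias(self, state, action):
        """Average fear recorded when this action was taken in this
        state; higher past fear pushes the action down the ranking."""
        past = [e["fear"] for e, a in self.records[state] if a == action]
        return sum(past) / len(past) if past else 0.0
```

An untried action carries zero bias, which is one way the creature ends up sampling a variety of actions when a state recurs.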

Each emotion would have a range, and would diminish via logarithmic decay.
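One way to realize "a range with decay" is to clamp each emotion to [0, 1], multiply it down each time step, and let events spike it back up. The decay constant here is an arbitrary placeholder, and geometric per-tick decay is just one interpretation of the decay the post describes.

```python
def clamp(x, lo=0.0, hi=1.0):
    """Keep an emotion level within its fixed range."""
    return max(lo, min(hi, x))

def tick(level, rate=0.9):
    """One time step: the emotion diminishes toward zero."""
    return clamp(level * rate)

def spike(level, amount):
    """An event pushes the emotion up, saturating at the top of its range."""
    return clamp(level + amount)
```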

The level of fear in the creature would be increased based on:
  - Unexpected or wrongly predicted state
  - Level of future punishment predicted by the current state (due to past memories)
  - Actual punishment

When the level of fear is above a certain threshold, the system would narrow potential actions to those which are least flavored by previous fears, preferring those with the least predicted punishment.  This should result in the creature seeking a safe, familiar, predictable state when it is scared (running for cover, for example).  If there are no options remaining, actions which involve the greatest number of motor commands would be chosen (aggressively lashing out, for example).
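The fear rules above could be sketched as a pair of functions: one that raises fear from the three listed inputs, and one that narrows the action set when fear crosses a threshold. All weights and thresholds here are made-up placeholders for illustration.

```python
def update_fear(fear, surprise, predicted_punishment, punishment,
                decay=0.9):
    """Decay fear, then raise it from the three inputs listed above.
    The 0.5/0.3/0.7 weights are arbitrary illustrative values."""
    fear = fear * decay
    fear += 0.5 * surprise + 0.3 * predicted_punishment + 0.7 * punishment
    return min(fear, 1.0)

def fearful_choice(actions, fear_flavor, predicted_punishment,
                   motor_count, threshold=0.5):
    """Prefer actions least flavored by past fear, breaking ties by
    least predicted punishment; with no safe option left, lash out
    with the action involving the most motor commands."""
    safe = [a for a in actions if fear_flavor[a] < threshold]
    if safe:
        return min(safe, key=lambda a: predicted_punishment[a])
    return max(actions, key=lambda a: motor_count[a])
```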

The level of curiosity in the creature would be increased based on:
  - Unexpected or wrongly predicted state
  - Level of future reward predicted by the current state (due to past memories)
  - Actual reward

When the level of curiosity compared to the level of fear reaches a certain threshold, the system would narrow down potential actions to those which have the least amount of emotional flavoring, preferring those with the highest predicted reward.  This should result in the creature seeking out unfamiliar, new states, allowing it to form a better model of the environment as well as to seek out rewards.  This should also result in a back-and-forth with fear behaviors (cautiously investigating after a threat has passed, for example).
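The curiosity rule might look like the sketch below: act curiously only when curiosity outweighs fear by some margin, then pick the least emotionally flavored actions and break ties by predicted reward. The margin value and function names are invented for this illustration.

```python
def curious_choice(actions, emotional_flavor, predicted_reward,
                   curiosity, fear, margin=0.3):
    """When curiosity sufficiently outweighs fear, prefer the least
    emotionally flavored actions with the highest predicted reward.
    Returns None when curiosity does not win out (another emotion's
    behavior would take over instead)."""
    if curiosity - fear < margin:
        return None
    least = min(emotional_flavor[a] for a in actions)
    candidates = [a for a in actions
                  if emotional_flavor[a] <= least + 1e-9]
    return max(candidates, key=lambda a: predicted_reward[a])
```

Because the comparison is against fear rather than a fixed bar, the back-and-forth with fear behaviors falls out naturally: as fear decays after a threat passes, curiosity eventually wins the comparison again.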

The level of boredom in the creature would be the inverse of a logarithmic decay (the opposite of the other two emotions).  Boredom would increase over time, and would be decreased based on:
  - Unexpected or wrongly predicted state

When the level of boredom compared to the levels of fear and curiosity reaches a certain threshold, the system would narrow down potential actions to those which have been least frequently tried in the current state.  This should result in a back-and-forth with curiosity behaviors, allowing the creature to seek out new stimuli (when it finds itself "stuck in a rut", for example).  This should also prevent the creature from finding a safe, familiar place, stopping, and never moving from there (since there isn't anything else like hunger or thirst to motivate exploration).
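The boredom rule above can be sketched as a counter that climbs each quiet time step, resets on surprise, and picks the least-tried action when it takes over. As with the other sketches, the growth rate and names are illustrative placeholders, not ARTUR's actual design.

```python
from collections import Counter

class Boredom:
    def __init__(self, growth=0.01):
        self.level = 0.0
        self.growth = growth
        self.tried = Counter()   # (state, action) -> times tried

    def tick(self, surprised):
        """Boredom climbs over time; an unexpected or wrongly
        predicted state knocks it back down."""
        self.level = 0.0 if surprised else min(1.0, self.level + self.growth)

    def bored_choice(self, state, actions):
        """Prefer the action least frequently tried in this state."""
        choice = min(actions, key=lambda a: self.tried[(state, a)])
        self.tried[(state, choice)] += 1
        return choice
```

Note how this substitutes for hunger or thirst: left alone in a safe state, boredom inevitably accumulates until the creature tries something it hasn't done there before.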
« Last Edit: April 22, 2018, 10:23:18 PM by Paul »

Offline Pocky

  • bit
  • Posts: 6
    • View Profile
Re: ARTUR Exmachina
« Reply #10 on: April 23, 2018, 04:50:29 PM »
You're basically rebuilding evolution on a smaller scale, haha. You mention that any given state in the system should be remembered; this reminds me of many evolutionary instinct topics. Instinct is built from remembered states, emotional contexts, and consequences as well, just more solidified over time, like the instinct of all mammals to care for their young. But at the same time, some traumatic recurrent outlier events can be forgotten, and when they can't be forgotten, issues arise. So I guess the question is: how do you determine what and when a certain history should be "forgotten", so it doesn't badly influence your AI?
Also, your project may have an interesting twist if you built an actual insect robot at the same time  ;D

Offline Paul

  • Administrator
  • double
  • *****
  • Posts: 3499
  • Developer
    • View Profile
    • PaulsCode.Com
Re: ARTUR Exmachina
« Reply #11 on: April 24, 2018, 09:30:36 AM »
Quote from: Pocky on April 23, 2018, 04:50:29 PM
Also, your project may have an interesting twist if you built an actual insect robot at the same time  ;D

Definitely agree.  While the seed AI would probably not be intelligent enough to do much of interest with a robot, the hope is that after many iterations through the recursion engine, it could be used to drive a toy robot.