Feat/system/show graph (#555)
* • Library:
  - docs:  "Base class" -> "Core class"

* • CONVENTIONS:
  - added section on organization of repo and base vs. core classes

* • System
  - show_graph():
    fixed bug producing empty image for graphs with just one Mechanism
    added auto-recurrent projections

* • System
  - show_graph(): shows dimensions for Mechanisms and Projections

* • System, Process
  - minor mods to enabling of learning, to prevent a crash when learning is specified in the constructor
    but no learnable elements are included in the Process

jdcpni authored Nov 30, 2017
1 parent 4f01d6b commit 7b097af
Showing 19 changed files with 274 additions and 260 deletions.
60 changes: 58 additions & 2 deletions CONVENTIONS.md
@@ -1,6 +1,62 @@

# PsyNeuLink Coding and Documentation Conventions

# PsyNeuLink Organization, Coding, and Documentation Conventions

## REPOSITORY ORGANIZATION:

### Core:
Made up of two types of classes:
- *abstract base classes* (italicized) - cannot be instantiated.
- **core classes** (bold) - most basic (abstract) level of objects that can be instantiated.

#### Components
"Building blocks"
- *Mechanism*
- *ProcessingMechanism*
- **TransferMechanism**
- **IntegratorMechanism**
- **ObjectiveMechanism**
- *AdaptiveMechanism*
- **LearningMechanism**
- **ControlMechanism**
- **GatingMechanism**
- *Projection*
- *PathwayProjection*
- **MappingProjection**
- *ModulatoryProjection*
- **LearningProjection**
- **ControlProjection**
- **GatingProjection**
- *State*
- **InputState**
- **ParameterState**
- **OutputState**
- *ModulatorySignal*
- **LearningSignal**
- **ControlSignal**
- **GatingSignal**
- *Function*
- *TransferFunction*
- *CombinationFunction*
- *IntegratorFunction*
- *DistributionFunction*
- *LearningFunction*

#### Compositions
Objects that compose building blocks and control their execution.
- *Composition*
- **System**
- **Process**

#### Scheduler
Objects used by Compositions to control the execution of Components and Compositions.
- **Scheduler**
- **Condition**

### Library
Extensions of Core objects
- *Components:* classes derived from Core objects
- *Compositions:* models
- *Models:* published, implemented models
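
The abstract-vs-instantiable distinction drawn above can be sketched in plain Python. This is an illustrative sketch only, not PsyNeuLink's actual class definitions: it shows the pattern the list describes, where *abstract base classes* define an interface and cannot be instantiated, while **core classes** are the first instantiable level.

```python
from abc import ABC, abstractmethod

class Mechanism(ABC):
    """Abstract base class: defines the interface, cannot be instantiated."""
    @abstractmethod
    def execute(self, variable):
        ...

class TransferMechanism(Mechanism):
    """Core class: the most basic instantiable specialization."""
    def execute(self, variable):
        return variable  # identity transfer, purely for illustration

m = TransferMechanism()   # allowed
# Mechanism() would raise TypeError: can't instantiate abstract class
```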

### NAMING:

1 change: 1 addition & 0 deletions Scripts/Examples/EVC-Gratton.py
@@ -152,6 +152,7 @@ def test_outcome_function(**kwargs):
# Show characteristics of system:
mySystem.show()
mySystem.controller.show()
mySystem.show_graph(show_control=pnl.ALL, show_dimensions=pnl.ALL)

# configure EVC components
mySystem.controller.control_signals[0].intensity_cost_function = pnl.Exponential(rate=0.8046).function
3 changes: 1 addition & 2 deletions Scripts/Examples/Multilayer-Learning.py
@@ -104,7 +104,7 @@ def show_target(system):
)

mySystem.reportOutputPref = True
mySystem.show_graph(show_learning=pnl.ALL)
mySystem.show_graph(show_learning=pnl.ALL, show_dimensions=pnl.ALL)
# mySystem.show_graph()

stim_list = {Input_Layer: [[-1, 30]]}
@@ -119,4 +119,3 @@ def show_target(system):
termination_processing={pnl.TimeScale.TRIAL: pnl.AfterNCalls(Output_Layer, 1)}
)

mySystem.show_graph(show_learning=pnl.ALL)
19 changes: 17 additions & 2 deletions Scripts/Scratch Pad.py
@@ -716,7 +716,6 @@ def __init__(self, error_value):

#endregion


# region TEST MODULATORY SPECS
# print ("TEST MODULATORY SPECS")
#
@@ -811,7 +810,6 @@ def __init__(self, error_value):

#endregion


#region TEST DOCUMENTATION
# print ("TEST DOCUMENTATION")

@@ -881,6 +879,23 @@ def __init__(self, error_value):

#endregion

#region TEST System Graph with AutoAssociativeMechanism
print("TEST System Graph with AutoAssociativeMechanism")

a = pnl.DDM(name='MY DDM')
# a = pnl.RecurrentTransferMechanism(name='Autoassociator')
p = pnl.Process(pathway=[a],
learning=pnl.ENABLED
)
s = pnl.System(processes=[p])
s.show_graph(show_learning=pnl.ALL,
show_dimensions=pnl.ALL
)


#endregion
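
The region above exercises the bug this commit fixes: show_graph produced an empty image for a System with a single Mechanism. A plausible reading of the fix, reduced to a sketch (hypothetical helper, not PsyNeuLink's actual show_graph code): if only edges drive rendering, a graph with one Mechanism and no Projections emits nothing; declaring every node explicitly, and emitting a self-loop edge for an auto-recurrent projection, avoids the empty image.

```python
def build_dot(mechanisms, projections):
    """Build a DOT graph description. Declaring each node explicitly
    ensures a one-Mechanism graph with no Projections still renders.
    Sketch only -- hypothetical helper, not PsyNeuLink's show_graph."""
    lines = ["digraph G {"]
    for mech in mechanisms:
        lines.append('    "{}";'.format(mech))              # explicit node declaration
    for sender, receiver in projections:
        lines.append('    "{}" -> "{}";'.format(sender, receiver))
    lines.append("}")
    return "\n".join(lines)

# One Mechanism with an auto-recurrent (self-loop) projection:
dot = build_dot(["MY DDM"], [("MY DDM", "MY DDM")])
```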


#region TEST INSTANTIATION OF Cyclic and Acyclic Systems @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
#
#
2 changes: 1 addition & 1 deletion docs/source/ControlMechanisms.rst
@@ -1,7 +1,7 @@
ControlMechanisms
=================

**Base class**:
**Core class**:

* `ControlMechanism`

2 changes: 1 addition & 1 deletion docs/source/ControlProjections.rst
@@ -1,7 +1,7 @@
Control Projections
===================

**Base class**:
**Core class**:

* `ControlProjection`

2 changes: 1 addition & 1 deletion docs/source/GatingMechanisms.rst
@@ -1,7 +1,7 @@
GatingMechanisms
================

**Base class**:
**Core class**:

* `GatingMechanism`

2 changes: 1 addition & 1 deletion docs/source/GatingProjections.rst
@@ -1,7 +1,7 @@
GatingProjections
=================

**Base class**:
**Core class**:

* `GatingProjection`

2 changes: 1 addition & 1 deletion docs/source/IntegratorMechanisms.rst
@@ -1,7 +1,7 @@
IntegratorMechanisms
====================

**Base class**:
**Core class**:

* `IntegratorMechanism`

2 changes: 1 addition & 1 deletion docs/source/LearningMechanisms.rst
@@ -1,7 +1,7 @@
Learning Mechanisms
===================

**Base class**:
**Core class**:

* `LearningMechanism`

2 changes: 1 addition & 1 deletion docs/source/LearningProjections.rst
@@ -1,7 +1,7 @@
Learning Projections
====================

**Base class**:
**Core class**:

* `LearningProjection`

2 changes: 1 addition & 1 deletion docs/source/ObjectiveMechanisms.rst
@@ -1,7 +1,7 @@
Objective Mechanisms
====================

**Base class**:
**Core class**:

* `ObjectiveMechanism`

2 changes: 1 addition & 1 deletion docs/source/TransferMechanisms.rst
@@ -1,7 +1,7 @@
Transfer Mechanisms
===================

**Base class**:
**Core class**:

* `TransferMechanism`

26 changes: 1 addition & 25 deletions psyneulink/components/component.py
@@ -1765,7 +1765,6 @@ def _instantiate_defaults(self,
# so that latter are evaluated in context of former
for param_name, param_value in sorted(default_set.items()):

# MODIFIED 11/30/16 NEW:
# FUNCTION class has changed, so replace rather than update FUNCTION_PARAMS
if param_name is FUNCTION:
try:
@@ -1778,27 +1777,22 @@ def _instantiate_defaults(self,
except UnboundLocalError:
pass
# FIX: MAY NEED TO ALSO ALLOW assign_default_FUNCTION_PARAMS FOR COMMAND_LINE IN CONTEXT
# MODIFIED 11/30/16 END

if param_name is FUNCTION_PARAMS and not self.assign_default_FUNCTION_PARAMS:
continue

# MODIFIED 11/29/16 NEW:
# Don't replace requested entry with default
if param_name in request_set:
continue
# MODIFIED 11/29/16 END

# Add to request_set any entries it is missing from the default_set
request_set.setdefault(param_name, param_value)
# Update any values in a dict
if isinstance(param_value, dict):
for dict_entry_name, dict_entry_value in param_value.items():
# MODIFIED 11/29/16 NEW:
# Don't replace requested entries
if dict_entry_name in request_set[param_name]:
continue
# MODIFIED 11/29/16 END
request_set[param_name].setdefault(dict_entry_name, dict_entry_value)

# VALIDATE PARAMS
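
The hunk above merges entries from the default set into request_set with dict.setdefault, so values the caller supplied are never overwritten, and dict-valued entries (FUNCTION_PARAMS-style) are merged one level deep. The pattern in isolation, with generic names (a sketch, not the actual Component._instantiate_defaults method):

```python
def merge_defaults(request_set, default_set):
    """Merge default_set into request_set without clobbering entries the
    caller already supplied; dict-valued entries merge one level deep."""
    for name, value in sorted(default_set.items()):
        request_set.setdefault(name, value)      # keep any caller-supplied value
        if isinstance(value, dict):
            for sub_name, sub_value in value.items():
                request_set[name].setdefault(sub_name, sub_value)
    return request_set

merged = merge_defaults({"rate": 0.5}, {"rate": 1.0, "noise": 0.0})
# merged == {"rate": 0.5, "noise": 0.0}: the caller's rate is preserved
```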
@@ -1813,11 +1807,6 @@ def _instantiate_defaults(self,
# any override of _validate_params, which (should not, but) may process params
# before calling super()._validate_params
for param_name, param_value in request_set.items():
# # MODIFIED 11/25/17 OLD:
# if isinstance(param_value, tuple):
# param_value = self._get_param_value_from_tuple(param_value)
# request_set[param_name] = param_value
# MODIFIED 11/25/17 NEW:
if isinstance(param_value, tuple):
param_value = self._get_param_value_from_tuple(param_value)
elif isinstance(param_value, (str, Component, type)):
Expand All @@ -1826,7 +1815,6 @@ def _instantiate_defaults(self,
else:
continue
request_set[param_name] = param_value
# MODIFIED 11/25/17 END:

try:
self._validate_params(variable=variable,
@@ -1887,11 +1875,7 @@ def _assign_params(self, request_set:tc.optional(dict)=None, context=None):

self._instantiate_defaults(request_set=request_set,
target_set=validated_set,
# # MODIFIED 4/14/17 OLD:
# assign_missing=False,
# MODIFIED 4/14/17 NEW:
assign_missing=False,
# MODIFIED 4/14/17 END
assign_missing=False,
context=context)

self.paramInstanceDefaults.update(validated_set)
@@ -2086,7 +2070,6 @@ def _validate_params(self, request_set, target_set=None, context=None):
raise ComponentError("{0} is not a valid parameter for {1}".format(param_name, self.__class__.__name__))

# The value of the param is None in paramClassDefaults: suppress type checking
# DOCUMENT:
# IMPLEMENTATION NOTE: this can be used for params with multiple possible types,
# until type lists are implemented (see below)
if self.paramClassDefaults[param_name] is None or self.paramClassDefaults[param_name] is NotImplemented:
@@ -2100,18 +2083,14 @@ def _validate_params(self, request_set, target_set=None, context=None):
# If the value in paramClassDefault is a type, check if param value is an instance of it
if inspect.isclass(self.paramClassDefaults[param_name]):
if isinstance(param_value, self.paramClassDefaults[param_name]):
# MODIFIED 2/14/17 NEW:
target_set[param_name] = param_value
# MODIFIED 2/14/17 END
continue
# If the value is a Function class, allow any instance of Function class
from psyneulink.components.functions.function import Function_Base
if issubclass(self.paramClassDefaults[param_name], Function_Base):
# if isinstance(param_value, (function_type, Function_Base)): <- would allow function of any kind
if isinstance(param_value, Function_Base):
# MODIFIED 2/14/17 NEW:
target_set[param_name] = param_value
# MODIFIED 2/14/17 END
continue

# If the value in paramClassDefault is an object, check if param value is the corresponding class
@@ -2247,7 +2226,6 @@ def _validate_params(self, request_set, target_set=None, context=None):
raise ComponentError("Value of {} param for {} ({}) is not compatible with {}".
format(param_name, self.name, param_value, type_name))

# MODIFIED 11/25/17 NEW:
def _get_param_value_for_modulatory_spec(self, param_name, param_value):
from psyneulink.globals.keywords import MODULATORY_SPEC_KEYWORDS
if isinstance(param_value, str):
@@ -2273,8 +2251,6 @@ def _get_param_value_for_modulatory_spec(self, param_name, param_value):
raise ComponentError("PROGRAM ERROR: Could not get default value for {} of {} (to replace spec as {})".
format(param_name, self.name, param_value))

# MODIFIED 11/25/17 END:
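
The body of _get_param_value_for_modulatory_spec is truncated in this view; from the visible pieces, it resolves a string modulatory keyword to the parameter's default value and raises if no default exists. A hedged reconstruction of that pattern — everything here other than the MODULATORY_SPEC_KEYWORDS name is hypothetical, including the keyword set's contents:

```python
MODULATORY_SPEC_KEYWORDS = {"LEARNING", "CONTROL", "GATING"}  # assumed contents

def get_param_value_for_modulatory_spec(param_name, param_value, defaults):
    """If param_value is a modulatory keyword string, return the default
    registered for param_name; otherwise pass the value through unchanged.
    Sketch of the pattern only -- not the actual Component method."""
    if isinstance(param_value, str) and param_value in MODULATORY_SPEC_KEYWORDS:
        try:
            return defaults[param_name]
        except KeyError:
            raise ValueError("Could not get default value for {} "
                             "(to replace spec as {})".format(param_name, param_value))
    return param_value

value = get_param_value_for_modulatory_spec("gain", "CONTROL", {"gain": 1.0})
# value == 1.0: the keyword spec is replaced by the registered default
```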

def _get_param_value_from_tuple(self, param_spec):
"""Returns param value (first item) of a (value, projection) tuple;
"""
31 changes: 21 additions & 10 deletions psyneulink/components/process.py
@@ -1029,10 +1029,12 @@ def _instantiate_pathway(self, context):
self._instantiate__deferred_inits(context=context)

if self.learning:
self._check_for_target_mechanisms()
if self._target_mechs:
self._instantiate_target_input(context=context)
self._learning_enabled = True
if self._check_for_target_mechanisms():
if self._target_mechs:
self._instantiate_target_input(context=context)
self._learning_enabled = True
else:
self._learning_enabled = False
else:
self._learning_enabled = False

@@ -1934,6 +1936,8 @@ def _check_for_target_mechanisms(self):
Identify TARGET Mechanisms and assign to self.target_mechanisms,
assign self to each TARGET Mechanism
and report assignment if verbose
Returns True if TARGET Mechanisms are found and/or assigned, else False
"""

from psyneulink.components.mechanisms.processing.objectivemechanism import ObjectiveMechanism
@@ -1970,8 +1974,18 @@ def trace_learning_objective_mechanism_projections(mech):
if (isinstance(object_item, ObjectiveMechanism) and
object_item._learning_role is TARGET))

if not target_mechs:
if target_mechs:

# self.target_mechanisms = target_mechs
self._target_mechs = target_mechs
if self.prefs.verbosePref:
print("\'{}\' assigned as TARGET Mechanism(s) for \'{}\'".
format([mech.name for mech in self._target_mechs], self.name))
return True


# No target_mechs already specified, so get from learning_mechanism
elif self._learning_mechs:
last_learning_mech = self._learning_mechs[0]

# Trace projections to first learning ObjectiveMechanism, which is for the last mechanism in the process,
@@ -2005,13 +2019,10 @@ def trace_learning_objective_mechanism_projections(mech):

raise ProcessError("PROGRAM ERROR: {} has a learning specification ({}) "
"but no TARGET ObjectiveMechanism".format(self.name, self.learning))
return True

else:
# self.target_mechanisms = target_mechs
self._target_mechs = target_mechs
if self.prefs.verbosePref:
print("\'{}\' assigned as TARGET Mechanism(s) for \'{}\'".
format([mech.name for mech in self._target_mechs], self.name))
return False
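
The method now reports its outcome, and _instantiate_pathway (in the first hunk of this file) uses that return value to gate learning. Reduced to a minimal sketch with hypothetical names, learning is enabled only when the check succeeds and TARGET Mechanisms were actually found:

```python
class ProcessSketch:
    """Hypothetical reduction of the control flow above -- not the
    actual Process class."""
    def __init__(self, target_mechs=None):
        self._target_mechs = target_mechs or []
        self._learning_enabled = False

    def _check_for_target_mechanisms(self):
        # Stands in for the real search over the Process's Mechanisms
        return bool(self._target_mechs)

    def enable_learning(self):
        # Mirrors the gating in _instantiate_pathway: both the check and
        # the presence of targets are required to enable learning.
        if self._check_for_target_mechanisms() and self._target_mechs:
            self._learning_enabled = True
        else:
            self._learning_enabled = False

with_target = ProcessSketch(target_mechs=["TARGET"])
with_target.enable_learning()        # _learning_enabled becomes True
without_target = ProcessSketch()
without_target.enable_learning()     # _learning_enabled stays False
```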

def _instantiate_target_input(self, context=None):

