Perceptors

From Simspark

Perceptors are the senses of an agent, allowing awareness of the agent's model state and the environment.

The server sends perceptor messages to agents via the network protocol in every cycle of the simulation. Details of each message type are given in the sections below.

There are both general perceptors that apply to all simulations, and soccer perceptors that are specific to the soccer simulation.

General Perceptors

The perceptors described in this subsection are available to all types of simulation. In other words they are not specific to the soccer environment.

GyroRate Perceptor

The gyro rate perceptor delivers information about the change in orientation of a body. The message contains the GYR identifier, the name of the body to which the gyro perceptor belongs, and three rotation angles. These describe the rates of change in the orientation of the body during the last cycle, i.e. the current angular velocities along the three axes of freedom of the corresponding body in degrees per second. So that the orientation of the body can be tracked, gyro rate perceptor data is sent every cycle.

TODO confirm the order in which these rotations are applied -- seems to be x, y then z

TODO confirm when exactly the value was sampled -- such as at the end of the 0.02 second time step

Nao has a gyro perceptor in the upper torso.

Message format (GYR (n <name>) (rt <x> <y> <z>))
<name> The name of the body part the gyro rate perceptor is attached to.
<x> <y> <z> The current angular velocities along the three axes of freedom of the corresponding body in degrees per second.
Example message (GYR (n torso) (rt 0.01 0.07 0.46))
Frequency Every cycle
Noise model None, however values are truncated to two decimal places, equating to a uniform error of up to 0.01 degrees.

TODO confirm whether these values are truncated or rounded
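Since every perceptor message shares the same parenthesized S-expression syntax, a single small reader can handle all of them. The sketch below is a minimal, hypothetical helper (not part of any SimSpark API) applied to the example GYR message above.

```python
import re

def parse_sexp(msg):
    """Parse a perceptor message into nested lists of string tokens.
    Hypothetical helper, not part of any SimSpark API."""
    tokens = re.findall(r'\(|\)|[^\s()]+', msg)

    def read(i):
        items = []
        while i < len(tokens):
            tok = tokens[i]
            if tok == '(':
                sub, i = read(i + 1)
                items.append(sub)
            elif tok == ')':
                return items, i + 1
            else:
                items.append(tok)
                i += 1
        return items, i

    tree, _ = read(0)
    return tree[0]

# The example message from the table above
gyr = parse_sexp("(GYR (n torso) (rt 0.01 0.07 0.46))")
body_name = gyr[1][1]                   # "torso"
rates = [float(v) for v in gyr[2][1:]]  # angular velocities in deg/s
```

The same `parse_sexp` helper works unchanged for the HJ, UJ, FRP, ACC and GS messages described below, since they all use the same bracketed format.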

HingeJoint Perceptor

Hinge Joint

A hinge joint perceptor receives information about the angle of the corresponding single-axis hinge joint. It contains the identifier HJ, the name of the perceptor and the position angle of the axis in degrees. A zero angle corresponds to the connected bodies being aligned straight. The position angle of each hinge joint perceptor is sent every cycle.

Each hinge joint has minimum and maximum limits on its angular position. This varies from hinge to hinge and depends upon the model being used.

Message format (HJ (n <name>) (ax <ax>))
<name> The name of the corresponding hinge joint.
<ax> The current position angle in degrees.
Example message (HJ (n laj3) (ax -1.02))
Frequency Every cycle
Noise Model None, however values are truncated to two decimal places, equating to a uniform error of up to 0.01 degrees.

TODO confirm whether these values are truncated or rounded

This perceptor's dual is the HingeJoint Effector.
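For a single known message type, a one-off regular expression is often simpler than a full parser. This sketch extracts the two fields of the example HJ message above; the variable names are illustrative only.

```python
import re

# Hypothetical one-off extraction of a hinge joint reading; <name> and <ax>
# follow the message format shown above.
msg = "(HJ (n laj3) (ax -1.02))"
m = re.match(r"\(HJ \(n (\w+)\) \(ax (-?\d+(?:\.\d+)?)\)\)", msg)
joint_name = m.group(1)        # "laj3"
angle_deg = float(m.group(2))  # current position angle in degrees
```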

UniversalJoint Perceptor

Universal Joint

A universal joint perceptor receives information about the two angles of the corresponding two-axis universal joint. It contains the identifier UJ, the name of the perceptor and the position angles of the two axes. Zero degrees corresponds to the connected bodies being aligned straight.

Message format (UJ (n <name>) (ax1 <ax1>) (ax2 <ax2>))
<name> The name of the corresponding universal joint.
<ax1> <ax2> The current position angles of the two axes in degrees.
Example message (UJ (n laj1 2) (ax1 -1.32) (ax2 2.00))
Frequency Every cycle
Noise model None, however values are truncated to two decimal places, equating to a uniform error of up to 0.01 degrees.

TODO confirm whether these values are truncated or rounded

This perceptor's dual is the UniversalJoint Effector.

Touch Perceptor

This perceptor works like a bumper that is triggered if the agent part to which it is mounted collides with another simulation object. The perceptor always reports its own unique name, which allows the use of more than one touch perceptor per agent. It also reports a value: 0 meaning no collision was detected, or 1 meaning a collision was detected.

This perceptor is currently (rcssserver3d 0.6.5) not used in any robot models in rcssserver3d.

Message format (TCH n <name> val <bit>)
Example message (TCH n bumper val 1)

ForceResistance Perceptor

This perceptor reports the force that acts on a body. After the identifier FRP and the name of the body, the perceptor message contains two vectors. The first vector describes the point of origin of the force relative to the body itself, and the second the resulting force at this point. The two vectors are only an approximation of the real applied force: the point of origin is calculated as the weighted average of all contact points to which force is applied, while the force vector represents the total force applied across all of these contact points. Force resistance perceptor data is only sent while the corresponding body is colliding with another simulation object. If no force is applied, the message of this perceptor is omitted.

Nao and Soccerbot both have two of these perceptors, located in the bottom of each foot and labeled lf and rf.

Message format (FRP (n <name>) (c <px> <py> <pz>) (f <fx> <fy> <fz>))
<name> The name of the body, to which the force resistance perceptor belongs.
<px> <py> <pz> The local coordinates of the origin of the applied force in meters.
<fx> <fy> <fz> The components of the force vector. The length of the force vector represents the given force in newton (kg m/s2).
Example message (FRP (n lf) (c -0.14 0.08 -0.05) (f 1.12 -0.26 13.07))
Frequency Every cycle, but only in case of a present collision.
Noise model None, however values are truncated to two decimal places, equating to a uniform error of up to 0.01 metres/newtons.

TODO confirm whether these values are truncated or rounded
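A common use of this perceptor is detecting whether a foot is loaded. The sketch below interprets the example FRP reading above; the variable names and the contact threshold are assumptions, not SimSpark identifiers.

```python
import math

# Sketch: interpret the example FRP reading from the table above.
contact = (-0.14, 0.08, -0.05)   # metres, local to the foot body
force = (1.12, -0.26, 13.07)     # newtons

# Net force magnitude; roughly half the robot's weight when both feet
# share the load.
magnitude = math.sqrt(sum(f * f for f in force))

# Hypothetical contact threshold to filter out tiny spurious contacts.
foot_loaded = magnitude > 1.0
```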

Accelerometer

This perceptor measures the proper acceleration it experiences relative to free fall. As a consequence an accelerometer at rest relative to the Earth's surface will indicate approximately 1g (9.81m/s^2) upwards. To obtain the acceleration due to motion with respect to the earth, this gravity offset should be subtracted.

Nao has an accelerometer in the upper torso.

Message format: (ACC (n <name>) (a <x> <y> <z>))
<name>: The name of the body containing the accelerometer.
<x> <y> <z>: The current acceleration along the three axes of freedom of the corresponding body in m/s2.
Example message: (ACC (n torso) (a 0.00 0.00 9.81))
Frequency: Every cycle
Noise model: None, however values are truncated to two decimal places, equating to a uniform error of up to 0.01 m/s2.

TODO confirm whether these values are truncated or rounded
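Subtracting the gravity offset is only this simple while the torso is upright; in general the offset must first be rotated into the torso frame using the current orientation estimate (e.g. integrated from the gyro perceptor). A minimal sketch for the upright case, assuming the local z axis points up:

```python
# Sketch: remove the gravity offset from a raw accelerometer reading,
# assuming the torso is upright so gravity lies along the local z axis.
GRAVITY = 9.81  # m/s^2, as reported at rest (see the example above)

raw = (0.00, 0.00, 9.81)
motion = (raw[0], raw[1], raw[2] - GRAVITY)  # acceleration due to motion
```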




Soccer Perceptors

The following perceptors are soccer specific and only available in the soccer simulation. Remember that although SimSpark is primarily used by the RoboCup 3D simulation league, it is a generic simulator, suited to other purposes. rcssserver3d provides support for the following perceptors.

Vision Perceptors

The vision perceptor delivers information about visible objects in the environment, where objects are other players, the ball, field lines or markers on the field. Currently there are 8 markers on the field: one at each corner point of the field and one at each goal post. For each visible object you get a vector described in spherical coordinates: the distance together with the horizontal and vertical angle to the center of the visible object, relative to the orientation of the camera.

Polar Vision

Unlike the 2D soccer simulation, the vision system does not deliver object velocities. All distances and angles are given relative to the position and orientation of the camera.

Objects are not occluded by other objects at this point, although this may be implemented in future.

The noise parameters of the vision system are as follows:

  • A small calibration error is added to the camera position. For each axis the error is uniformly distributed between ±0.005m. The error is calculated once and remains constant for an agent's lifetime (note that new competition rules involve server restarts at half time, which will recalculate this calibration error).
  • Dynamic noise normally distributed around 0.0 for each of:
    • Distance error: σ² = 0.0965 (also, distance error is multiplied by distance/100)
    • Horizontal angle (φ) error: σ² = 0.1225
    • Vertical angle (θ) error: σ² = 0.1480
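The dynamic distance noise above can be simulated to test a localization filter offline. This is a purely illustrative sketch of the stated model (zero-mean Gaussian with variance 0.0965, scaled by distance/100); the function name is hypothetical.

```python
import random

# Sketch of the dynamic distance noise described above: zero-mean Gaussian
# with variance 0.0965, multiplied by distance/100. Illustrative only.
def noisy_distance(true_distance, rng):
    sigma = 0.0965 ** 0.5
    return true_distance + rng.gauss(0.0, sigma) * true_distance / 100.0

rng = random.Random(42)
d = noisy_distance(10.0, rng)  # close to 10 m, with ~0.03 m std deviation
```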

A vision message starts with See followed by a list of seen objects. While the ball and the markers on the field are simple objects described by just one position vector, a player is a more complex object and needs a more detailed description as well as additional information such as the team name. A player is therefore described by the name of its team, the player number and one or more position vectors to different body parts. Field lines are perceived as two position vectors (the "start" and "end" points), clipped by the restricted field of view. Unlike markers on the field, field lines are not labeled with identifiers.

In older server versions players are also described with just one position vector. In this case the message for a player contains just one position vector to the center of the upper torso, without any additional body part information. Also, field lines are not visible in older server versions (below 6.4).

Note that the data from vision perceptors is not reported for every simulation cycle. As of May 2012, they are reported every 3 cycles (0.06 sec).

Message format
(See +(<name> (pol <distance> <angle1> <angle2>))
    +(P (team <teamname>) (id <playerID>) +(<bodypart> (pol <distance> <angle1> <angle2>)))
    +(L (pol <distance> <angle1> <angle2>) (pol <distance> <angle1> <angle2>)))
<name> The name of the simple object as listed in table Simple visible objects.
<distance> The distance to the seen object.
<angle1> The horizontal angle (φ) to the visible object relative to the view direction, in degrees to two decimal places. Positive angles are left of the vertical midline of the agent's vision, negative angles right.
<angle2> The vertical angle (θ) to the visible object relative to the view direction, in degrees to two decimal places. Positive angles mean the seen object is above the horizontal midline of the agent's vision, negative means below.
<teamname> The name of the team, to which the seen player belongs.
<playerID> The player number of the seen player.
<bodypart> The name of the body part.
Example message
(See (G2R (pol 17.55 -3.33 4.31)) 
     (G1R (pol 17.52 3.27 4.07)) 
     (F1R (pol 18.52 18.94 1.54)) 
     (F2R (pol 18.52 -18.91 1.52)) 
     (B (pol 8.51 -0.21 -0.17)) 
     (P (team teamRed) (id 1) 
        (head (pol 16.98 -0.21 3.19)) 
        (rlowerarm (pol 16.83 -0.06 2.80)) 
        (llowerarm (pol 16.86 -0.36 3.10)) 
        (rfoot (pol 17.00 0.29 1.68)) 
        (lfoot (pol 16.95 -0.51 1.32))) 
     (P (team teamBlue) (id 3) 
        (rlowerarm (pol 0.18 -33.55 -20.16)) 
        (llowerarm (pol 0.18 34.29 -19.80)))
     (L (pol 12.11 -40.77 -2.40) (pol 12.95 -37.76 -2.41)) 
     (L (pol 12.97 -37.56 -2.24) (pol 13.32 -32.98 -2.20)))
Frequency Every third cycle (every 0.06 seconds)
Noise Model Calibration error (a fixed offset of around ±0.004m in each of x/y/z axes), Gaussian noise (as described above) and values are truncated to two decimal places, equating to a uniform error of up to 0.01.

TODO confirm whether these values are truncated or rounded
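For localization, each (pol <distance> <angle1> <angle2>) reading is usually converted into Cartesian camera coordinates. The axis convention in this sketch (x forward, y left, z up, positive angle1 to the left, positive angle2 upward) is an assumption consistent with the sign conventions described above.

```python
import math

# Sketch: convert a (pol <distance> <angle1> <angle2>) reading into
# local Cartesian camera coordinates (assumed x forward, y left, z up).
def pol_to_cart(distance, angle1_deg, angle2_deg):
    phi = math.radians(angle1_deg)    # horizontal angle
    theta = math.radians(angle2_deg)  # vertical angle
    x = distance * math.cos(theta) * math.cos(phi)
    y = distance * math.cos(theta) * math.sin(phi)
    z = distance * math.sin(theta)
    return x, y, z

# The ball from the example message: (B (pol 8.51 -0.21 -0.17))
bx, by, bz = pol_to_cart(8.51, -0.21, -0.17)
```

The ball ends up almost straight ahead of the camera, slightly to the right and below the optical axis, matching the small negative angles in the example.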


Visible object Name Notes
Flags F1L, F1R, F2L, F2R The point at the base of the flag's pole is seen.
Goalposts G1L, G1R, G2L, G2R The point at the top of the goal's post is seen.
Ball B TODO can someone confirm that the center of the ball is observed please?
Simple visible objects.

Read about the Field Dimensions and Layout to understand where the fixed landmarks are placed on the field.


Visible body part Identifier
Head head
Right lower arm rlowerarm
Left lower arm llowerarm
Right foot rfoot
Left foot lfoot
Visible body parts of Nao.

Vision Perceptor (Omnicam)

This was the original VisionPerceptor, offering a 360° view. The direction of view (pan and tilt) can still be changed, however.

The Soccerbot model uses this perceptor.

Restricted Vision Perceptor

The RestrictedVisionPerceptor limits the field of view to 120°. This became the default vision perceptor in version 0.5.

Nao possesses a restricted vision perceptor at the center of its head.

GameState Perceptor

The game state perceptor delivers information about the current state of the soccer game environment. A game state message starts with the GS identifier, followed by a list of state information. Currently just the play time and play mode are transmitted each cycle. Play time starts from zero at kickoff of the first half and from 300 at kickoff of the second half, and is given as a floating point number in seconds, to two decimal places.

For more information about states of the game, see Play Modes.

Message format (GS (t <time>) (pm <playmode>))
<time> The current play time in seconds.
<playmode> The current play mode of the soccer game.
Example message (GS (t 0.00) (pm BeforeKickOff))
Frequency Every cycle
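Because play time restarts from 300 at the second kickoff, an agent can derive the current half directly from the GS message. A minimal sketch; the helper name is illustrative only.

```python
import re

# Sketch: extract play time and play mode from a GS message and derive
# the half from the 300-second second-half offset described above.
def parse_gamestate(msg):
    m = re.match(r"\(GS \(t ([\d.]+)\) \(pm (\w+)\)\)", msg)
    return float(m.group(1)), m.group(2)

play_time, play_mode = parse_gamestate("(GS (t 0.00) (pm BeforeKickOff))")
half = 1 if play_time < 300.0 else 2
```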

Future versions of the soccer simulation may extend the data provided by this perceptor to include game variables such as ball weight and field size, although it seems equally valid to require agents to deduce these properties of the environment on their own.

AgentState Perceptor

The AgentState perceptor gives information about the internal state of the agent. It reports information about the current battery status and the temperature of the agent.

Message format (AgentState (temp <degree>) (battery <percentile>))
Example message (AgentState (temp 48) (battery 75))

Hear Perceptor

Agent processes are not allowed to communicate with each other directly, but agents may exchange messages via the simulation server. For this purpose agents are equipped with the so-called hear perceptor, which serves as an aural sensor and receives messages shouted by other players. The underlying model stems from the 2D soccer simulation and has been integrated into the 3D simulator since server version 0.4.

Message format (hear <time> self/<direction> <message>)
<time> The simulation time at which the given message was heard in seconds (a real number.)
<direction> Either a relative horizontal direction in degrees indicating where the sound originated, or self indicating that the player is hearing their own words. (TODO: check whether the direction is relative to the orientation of the head or the torso)
<message> Up to 20 characters, which may be taken from the ASCII printing character subset [0x20, 0x7E] except the white space character and the normal brackets ( and )
Example messages (hear 12.3 self helloworld)
(hear 12.3 -12.7 helloyourself)
Frequency Head capacity model

This perceptor's dual is the Say Effector.

Restrictions

The hear perceptor comes with some restrictions:

  1. Messages are restricted to a maximal length (currently 20 bytes).
  2. Message may only consist of characters from the ASCII subset [0x21; 0x7E] excluding [0x28; 0x29] which are the parenthesis characters, ( and ).
  3. Messages shouted from beyond a maximal distance (currently 50 meters) cannot be heard. Note that as the field is currently only 20x30 metres (36 diagonally), this does not turn out to be a limit in practice.
  4. The number of messages which can be heard at the same time is bounded. Each player can hear at most one message from a given team every two simulation cycles (thus every 0.04 seconds per team). Capacities are tracked separately for the two teams, so that teams cannot block the hear perceptors of their opponents by shouting permanently. If multiple messages are spoken by members of a team within two simulation cycles, only one will be heard (the first to reach the server) and the rest will be discarded. Messages shouted by oneself, though, will always be heard [Vor06].

With these restrictions, there are 91 unique values per byte. With 20 bytes there are 1.516456694×10³⁹ unique messages, amounting to around 130 bits of information if carefully encoded.
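The capacity arithmetic above can be checked directly, taking the document's figure of 91 allowed values per character:

```python
import math

# Check the capacity claim above: 91 allowed characters per position and
# 20 positions give 91**20 possible messages, i.e. about 130 bits.
unique_messages = 91 ** 20
bits = math.log2(unique_messages)  # about 130.2 bits
```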
