A closed-loop Petri Net (PN) model was developed to exhibit, maintain, and withdraw facial expressions of six basic affective states, in a human-like manner, on a robotic face. The PN model was designed to enable execution of major facial-muscle-like interactions between segments of a latex facial mask. These muscle-like interactions were based on the widely accepted and used Facial Action Coding System (FACS). To validate the PN model, the facial mask, mounted on a 3-D printed artificial skull, was used as a robotic face. Human-facial-muscle-like movements, generated on the surface of the facial mask, were able to express positive and negative affective states. Audio signals were used as stimuli for eliciting expressions of affective states. Results show that an event-driven discrete model suffices for a deterministic representation of a finite number of affective states on an artificial robotic face.
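The exhibit/maintain/withdraw cycle described above can be sketched as a small event-driven Petri net, where places hold the current expression state and transitions fire in response to stimuli. This is a minimal illustrative sketch, not the paper's actual model; the place and transition names (`neutral`, `happy`, `exhibit_happiness`, `withdraw_happiness`) are assumptions introduced here for clarity.

```python
# Minimal event-driven Petri net sketch for one affective state.
# Places and transitions are illustrative assumptions, not taken
# from the paper's model.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        # A transition is enabled when every input place holds a token.
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        # Fire the transition if enabled: consume input tokens,
        # produce output tokens. Returns True on success.
        if not self.enabled(name):
            return False
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1
        return True

# One affective state: a token in "happy" means the expression is
# currently exhibited and maintained; firing "withdraw_happiness"
# returns the face to neutral.
net = PetriNet({"neutral": 1, "happy": 0})
net.add_transition("exhibit_happiness", ["neutral"], ["happy"])
net.add_transition("withdraw_happiness", ["happy"], ["neutral"])

net.fire("exhibit_happiness")   # e.g. triggered by an audio stimulus event
net.fire("withdraw_happiness")  # stimulus ends; expression is withdrawn
```

Because the net is deterministic and has finitely many markings, each affective state maps to a distinct, reproducible token configuration, which is the property the abstract's conclusion relies on.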