Secondary Reinforcer refers to a stimulus that gains reinforcing properties because it is associated with a primary reinforcer.

Under this conception, money is a secondary reinforcer because having it allows greater access to primary reinforcers such as food, clothing, etc.

Variable-Interval Schedule - A schedule of reinforcement or punishment in which the reinforcer or punisher is presented after a variable amount of time.
Variable-Ratio Schedule - A schedule of reinforcement or punishment in which the reinforcer or punisher is presented after a variable number of responses (e.g., lever presses).
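
To make the difference between the two schedules concrete, here is a minimal simulation sketch in Python; the function names, mean values, and response stream are illustrative assumptions rather than part of the standard definitions.

import random

# Illustrative sketch (hypothetical helper names and parameters):
# a variable-ratio schedule reinforces after a varying NUMBER of responses,
# while a variable-interval schedule reinforces the first response after a
# varying AMOUNT OF TIME has elapsed.

def variable_ratio_schedule(response_times, mean_ratio=5):
    """Reinforce after a variable number of responses (count-based)."""
    reinforced = []
    threshold = random.randint(1, 2 * mean_ratio - 1)  # varies around the mean
    count = 0
    for t in response_times:
        count += 1
        if count >= threshold:
            reinforced.append(t)                        # this response is reinforced
            count = 0
            threshold = random.randint(1, 2 * mean_ratio - 1)
    return reinforced

def variable_interval_schedule(response_times, mean_interval=30.0):
    """Reinforce the first response after a variable amount of time (time-based)."""
    reinforced = []
    wait = random.uniform(0, 2 * mean_interval)         # varies around the mean
    last = 0.0
    for t in response_times:
        if t - last >= wait:
            reinforced.append(t)                        # this response is reinforced
            last = t
            wait = random.uniform(0, 2 * mean_interval)
    return reinforced

if __name__ == "__main__":
    # 100 hypothetical lever presses over a 10-minute (600-second) session
    presses = sorted(random.uniform(0, 600) for _ in range(100))
    print("Variable-ratio reinforcements:", len(variable_ratio_schedule(presses)))
    print("Variable-interval reinforcements:", len(variable_interval_schedule(presses)))

The only difference in the sketch is what is checked before a reinforcer is delivered: a response count in the variable-ratio case versus elapsed time in the variable-interval case.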

