I'll try to explain why the concept of a 300 Euro note is, at least intuitively, nonsense.
1. With a decimal currency, the main unit (for example, the Euro, Guilder, Dollar or Pound) can be divided into 20s or 25s, as well as 50s, 10s, 5s, 2s and 1s. (Dividing into 30s would be bizarre.)
2. The subdivisions are repeated (often with omissions) in the multiples of the main unit. So in the Netherlands, you had coins for 2.5 and 5 Guilders and notes for 10, 25, 50, 100, 250, 500 and 1000 Guilders. All of that was highly consistent.

If the subdivisions are 1, 2, 5, 10, 20, 50 (as with the pound, for example), then one would expect the multiples of the pound to be 2, 5, 10, 20, 50, etc. (In fact, the £50 note is the highest issued.)
3. In practice, few currencies are issued with a wholly consistent set of multiples. For example, the US abandoned first the $2.50 coin and more recently the $2 note. Moreover, the $20 note doesn't 'mirror' any 20c coin.
4. Where there are such inconsistencies, they are (nowadays) invariably a matter of *omission* (as in the American system) and not a matter of adding denominations.
5. The reference to the pre-decimal pound was an arcane point: simply that 3s (and 30s, 300s) *would* only make sense in a currency based on 12, 24, etc.