Why does my coin-flip histogram look so "pointy"?


This is probably a very basic question for this Stack Exchange, but hopefully it's still welcome. I am a math teacher and wanted to write a program to demonstrate normal distributions. I wrote a short sketch in Processing that runs 2000 trials; on each trial it flips 200 coins, counts the heads, and draws a histogram of the counts:

int piles = 200;                    // coins flipped per trial; also the number of histogram columns
int[] amounts = new int[piles + 1]; // amounts[v] = trials that produced v heads (v can be 0..piles inclusive)
int counter = 0;                    // trials completed so far
int maximum = 2000;                 // total number of trials

void setup() {
  size(1000, 1000);
  for (int i = 0; i < piles; i++) {
    amounts[i] = 0;
  }
}

void draw() {
  if (counter < maximum) {
    // Flip "piles" coins and count how many come up heads.
    int value = 0;
    for (int i = 0; i < piles; i++) {
      if ((int) random(0, 2) == 0) {
        value++;
      }
    }
    amounts[value]++;

    // Redraw the histogram: one small circle per trial, stacked into columns.
    for (int i = 0; i < piles; i++) {
      for (int k = 0; k < amounts[i]; k++) {
        ellipse((1000 * i + 500) / piles, (1000 * k + 500) / piles, 1000 / piles, 1000 / piles);
      }
    }
    counter++;
  }
}

The strange thing is that the histogram always ends up looking very "pointy". Does this happen because of the random() method? Are normal distributions pointier than I thought, or am I missing something stupid?

Screenshot of the resulting histogram: https://i.stack.imgur.com/DZGqu.png
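
For what it's worth, here is a quick back-of-the-envelope check of what spread I should even expect (a minimal sketch in Processing's static mode; it assumes the flips are fair and independent, which is how I understand random()):

// Theoretical spread of the head count for n fair coin flips:
// it is binomial, with mean n/2 and standard deviation sqrt(n)/2.
int n = 200;                  // coins per trial, as in the sketch above
float mean = n / 2.0;         // expected heads: 100
float sd = sqrt(n) / 2.0;     // sqrt(200 * 0.5 * 0.5), about 7.07
println("mean = " + mean);
println("sd   = " + sd);
println("3-sd band: " + (mean - 3 * sd) + " to " + (mean + 3 * sd));
// Prints roughly 78.8 to 121.2, i.e. only about a fifth of the 0..200 axis.

If that is right, nearly all 2000 trials should land in a band covering only about a fifth of the horizontal axis, so maybe the spread is just narrower than I was picturing?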