How can a website know that I’m not a bot?

I know that the Same-Origin Policy only applies between websites, but how does a web server really know that I sent the request from a website? I guess one of the things it can check is the User-Agent header.

I set up an HTTP server with Python, and by default the Python HTTP server does not allow any origin to access its data.

HTTP Server

As you can see, I commented out the header that I had set. Without it, the server does not allow any origin to access its data.

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler, test

class CORSRequestHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # self.send_header('Access-Control-Allow-Origin', '*')  # wildcard means all origins
        SimpleHTTPRequestHandler.end_headers(self)

if __name__ == '__main__':
    test(CORSRequestHandler, HTTPServer, port=9000)
```
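From a quick experiment (standard-library only; I let the OS pick a free port instead of hard-coding 9000), the server above still hands its data to a non-browser client either way; it just omits the CORS header, so any blocking must be happening in the browser, not on the server:

```python
import threading
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Same kind of server as above, without the Access-Control-Allow-Origin header.
server = HTTPServer(('127.0.0.1', 0), SimpleHTTPRequestHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# A non-browser client gets the data anyway: the server never refuses,
# it only omits the CORS header. Enforcement is done by browsers.
resp = urllib.request.urlopen(f'http://127.0.0.1:{port}/')
print(resp.status)                                       # 200: data is served
print(resp.headers.get('Access-Control-Allow-Origin'))   # None: no CORS header
server.shutdown()
```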

Request File

Above I asked "how does a website know that I come from another website". I guessed the answer was the User-Agent. But here I made an HTTP GET request with Python and set the User-Agent header to look normal, as if I were browsing with a browser, and I can still get the response. SOP doesn't block me.

```python
import requests

headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36'}
resp = requests.get('', headers=headers)
print(resp.content)
```
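As far as I can tell, the only way a server could tell where a request "came from" is from headers the client chooses to send, mainly Origin and Referer, and a script can forge those just like User-Agent. A sketch with a hypothetical echo server (standard library only) that replies with whatever Origin header I send it:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class OriginEchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Reply with the Origin header the client sent, or 'none'.
        body = self.headers.get('Origin', 'none').encode()
        self.send_response(200)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the output quiet

server = HTTPServer(('127.0.0.1', 0), OriginEchoHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# A browser sets Origin automatically on cross-origin requests;
# a script can set it to anything (or leave it out entirely).
req = urllib.request.Request(f'http://127.0.0.1:{port}/',
                             headers={'Origin': 'https://example.com'})
echoed = urllib.request.urlopen(req).read().decode()
print(echoed)  # https://example.com
server.shutdown()
```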

So, how do websites know where I am browsing from?