A website URL is provided to the code. The crawler fetches the page's source and collects every URL it contains. In a second loop, the crawler visits each of those parent URLs and gathers the child URLs found on each page. The child URLs are then written to a file called depth_1.
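The depth-1 crawl described above could be sketched roughly as follows. This is a minimal illustration, not the project's actual implementation: the names (`extract_links`, `crawl_depth_one`) are hypothetical, and the page-fetching function is passed in as a parameter so the logic can be exercised without network access (a real run might supply `urllib.request.urlopen(...).read().decode()` as the fetcher).

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkParser(HTMLParser):
    """Collects href values from <a> tags, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # urljoin turns relative hrefs into absolute URLs
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    """Return all URLs linked from an HTML document."""
    parser = LinkParser(base_url)
    parser.feed(html)
    return parser.links


def crawl_depth_one(start_url, fetch):
    """Gather child URLs two hops from start_url and write them to depth_1.

    `fetch` is any callable mapping a URL to its HTML source (an
    assumption made here so the sketch works offline).
    """
    # First pass: all URLs on the starting page (the parent URLs).
    parent_urls = extract_links(fetch(start_url), start_url)

    # Second pass: visit each parent URL and collect its links.
    child_urls = []
    for url in parent_urls:
        child_urls.extend(extract_links(fetch(url), url))

    # Persist the depth-1 results, one URL per line.
    with open("depth_1", "w") as f:
        for url in child_urls:
            f.write(url + "\n")
    return child_urls
```

Injecting the fetcher keeps the crawl logic separate from the network layer, which also makes it easy to test against a small in-memory site.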