Harshda Vabale, Aneeta Kolhe
This project extracts the entire contents of a website and stores them on your local machine. The application can be used to extract a URL's contents and its subdirectories, and it works behind a firewall. Progress is updated in the status window.
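The slides do not show how the firewall support works, but it matches wget's standard proxy mechanism: wget honors the http_proxy and https_proxy environment variables. A minimal C# sketch under that assumption, using a hypothetical proxy address and an example URL:

    using System;
    using System.Diagnostics;

    class ProxyLaunchSketch
    {
        static void Main()
        {
            // wget honors the http_proxy/https_proxy environment variables,
            // which is one way a download can pass through a corporate firewall.
            ProcessStartInfo info = new ProcessStartInfo();
            info.FileName = "wget";
            info.Arguments = "--recursive http://www.example.com/";
            info.UseShellExecute = false;  // required to edit the child's environment

            // Hypothetical proxy host and port -- substitute your own.
            info.EnvironmentVariables["http_proxy"] = "http://proxy.example.com:8080/";
            info.EnvironmentVariables["https_proxy"] = "http://proxy.example.com:8080/";

            using (Process wget = Process.Start(info))
            {
                wget.WaitForExit();
                Console.WriteLine("wget exited with code " + wget.ExitCode);
            }
        }
    }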
We have used GNU Wget, a package for retrieving files over HTTP, HTTPS, and FTP, the most widely used Internet protocols. It creates a folder named after the URL; this folder contains the web pages and the images from that URL.
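A sketch of what launching GNU Wget from C# might look like; the flags shown (-r for recursion, -p for page requisites such as images, -k for link conversion, -P for the destination prefix) are standard wget options, and wget by default creates the host-named folder described above. The helper name StartWget and the placeholder URL and folder are ours, not from the original project:

    using System;
    using System.Diagnostics;

    class WgetInvocationSketch
    {
        // Launches wget so that it mirrors the given URL into destinationDir.
        // wget creates a subfolder named after the host (e.g. "www.example.com")
        // holding the downloaded pages and their images.
        static Process StartWget(string url, string destinationDir)
        {
            ProcessStartInfo info = new ProcessStartInfo();
            info.FileName = "wget";
            // -r : recurse into linked pages
            // -p : also fetch page requisites (images, stylesheets)
            // -k : convert links so the local copy can be browsed offline
            // -P : prefix directory under which the host-named folder is created
            info.Arguments = "-r -p -k -P \"" + destinationDir + "\" " + url;
            info.UseShellExecute = false;
            return Process.Start(info);
        }

        static void Main()
        {
            // Placeholder URL and folder for illustration only.
            Process wget = StartWget("http://www.example.com/", @"C:\Downloads");
            wget.WaitForExit();
            Console.WriteLine("Done; see the www.example.com folder under C:\\Downloads.");
        }
    }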
Enter the URL in the address bar and set the location where you want the files to be downloaded, then hit the EXTRACT DATA button. The status window shows progress and the number of files downloaded from the specified URL. Once all the files have been moved to the folder, a message confirms that the data has been extracted and moved to the specified location.
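How the status window is driven is not shown in the slides; one plausible approach is to read wget's log, which it writes to stderr, line by line and count its per-file "saved" messages. A console sketch under that assumption, with placeholder URL and destination folder:

    using System;
    using System.Diagnostics;

    class ExtractDataSketch
    {
        static void Main()
        {
            string url = "http://www.example.com/";  // placeholder URL
            string folder = @"C:\Downloads";          // placeholder destination

            ProcessStartInfo info = new ProcessStartInfo("wget",
                "-r -p -P \"" + folder + "\" " + url);
            info.UseShellExecute = false;
            info.RedirectStandardError = true;  // wget writes its log to stderr

            int filesDownloaded = 0;
            using (Process wget = Process.Start(info))
            {
                string line;
                while ((line = wget.StandardError.ReadLine()) != null)
                {
                    // wget prints a "... saved [size]" line per completed file;
                    // each one bumps the counter shown in the status window.
                    if (line.Contains("saved"))
                    {
                        filesDownloaded++;
                        Console.WriteLine("Files downloaded: " + filesDownloaded);
                    }
                }
                wget.WaitForExit();
            }
            Console.WriteLine("Data has been extracted and moved to " + folder);
        }
    }

In a GUI version, the counter updates would go to the status window instead of the console.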
When we extract data from the website www.timesofindia.com, the application retrieves all the information available on the current page. The extracted folders include classifieds, entertainment, RSS feeds, lifestyle, sports, and world, and the index page is downloaded as well. Essentially, we extract and download every linked URL's information along with its related images.
Windows Vista, .NET Framework 3.0, Microsoft Visual C#