How to Get a URL in Python

In this tutorial, you will learn how to get a URL in Python using the popular requests library. This library is widely used for fetching data from URLs, working with APIs, and web scraping. We will walk through installing the requests library and using it to fetch URL content in Python.

Step 1: Install the requests library

Before you get started, ensure that you have the latest version of Python installed on your computer. You can check your Python version by running the command:
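On most systems one of the following works; the exact command name depends on how Python was installed:

```shell
python --version
# or, where Python 3 is installed as python3:
python3 --version
```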

If you have Python installed, you can proceed to install the requests library. You can install the library using the pip command:
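For example:

```shell
pip install requests
# or, if pip is tied to a specific Python version:
python3 -m pip install requests
```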

Step 2: Import the requests library

Now that the requests library is installed, you should import it into your Python script:
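The import is a single line:

```python
# Make the requests library available in this script.
import requests
```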

Step 3: Fetching URL content – GET request

To fetch the content of a URL using the requests library, use the get() method. This method takes the URL as its argument and returns a Response object:
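A minimal sketch, using https://example.com as a placeholder URL:

```python
import requests

# Send an HTTP GET request; get() returns a Response object.
response = requests.get("https://example.com")
print(response)  # e.g. <Response [200]>
```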

Make sure to replace the URL with the one you want to fetch.

Step 4: Accessing response content and status

After making a GET request, you can access the content of the response using the text property and the status code using the status_code property. Here’s an example:
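For instance, again with https://example.com standing in for your URL:

```python
import requests

response = requests.get("https://example.com")

# .text holds the response body decoded as a string (HTML here);
# .status_code holds the numeric HTTP status.
print(response.text)
print(response.status_code)
```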

In this example, we fetch the URL passed to get() and print its content and status code.

Full Code:
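Putting the steps together (with https://example.com as a placeholder URL):

```python
import requests

url = "https://example.com"
response = requests.get(url)

# Print the response body followed by the HTTP status code.
print(response.text)
print("Status code:", response.status_code)
```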


The output shows the content of the URL (HTML code in this case) followed by the status code of the request (200, meaning the request was successful).

Step 5: Handling errors and exceptions

Sometimes, you might encounter errors such as invalid URLs or network issues while fetching a URL in Python. To handle these errors, wrap the request in try and except blocks:
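A sketch of this pattern; the hostname below is a deliberately unresolvable placeholder, so the except branch fires:

```python
import requests

url = "https://nonexistent-host.invalid"  # hypothetical, unresolvable hostname

try:
    response = requests.get(url, timeout=5)
    response.raise_for_status()  # raise HTTPError for 4xx/5xx responses
    print(response.text)
except requests.exceptions.RequestException as err:
    # RequestException covers connection errors, timeouts,
    # invalid URLs, and HTTP errors alike.
    print(f"Request failed: {err}")
```

Catching requests.exceptions.RequestException, the base class of the library's exceptions, handles all of these failure modes with one block.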

With these steps, you should be able to fetch URLs in Python using the requests library. Now you can move forward with making API requests, web scraping, or any other web-related tasks.


In this tutorial, you have learned how to get a URL in Python using the requests library. You have installed and imported the library, fetched a URL, accessed its content and status code, and learned how to handle errors and exceptions. With this knowledge, you can move on to making API requests, scraping websites, and building dynamic web applications.