Python is one of the fastest-growing programming languages today, thanks to its simplicity and the demand for it. Normally, every script we write has to be executed by hand; with automation, the script runs on its own at whatever interval we set according to our requirements. In this article, you’ll find some Python libraries that can make automation easy.
1. REQUESTS:
Usually, when you want to automate something, you come across APIs. Requests is an easy-to-use HTTP library for Python that lets you make requests and interact with APIs. To replicate a browser request quickly, open the Network tab in Google Chrome’s developer tools, select the request you want to repeat, click “Copy as cURL”, and paste the cURL command into a converter to get the equivalent Python code.
You can refer to its documentation: https://docs.python-requests.org/en/latest/
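As a minimal sketch, a GET request to a hypothetical API might look like this (the URL and parameters are placeholders, not a real endpoint):

```python
import requests

def fetch_json(url, params=None, timeout=10):
    """GET a URL and return the parsed JSON body, raising on HTTP errors."""
    response = requests.get(url, params=params, timeout=timeout)
    response.raise_for_status()  # turn 4xx/5xx responses into exceptions
    return response.json()

if __name__ == "__main__":
    # Hypothetical endpoint -- replace with the API you actually need.
    data = fetch_json("https://api.example.com/prices", params={"item": "book"})
    print(data)
```

Setting a `timeout` and calling `raise_for_status()` are small habits that save a lot of debugging when an API is slow or returns an error page.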
2. APSCHEDULER (TASK SCHEDULING LIBRARY FOR PYTHON):
When you make API requests, you usually want to run them periodically, say every fifteen minutes. For this kind of case, you can use a library called Advanced Python Scheduler (APScheduler). It has very good documentation as well, which lets you get started quickly. I used APScheduler when building screenshot automation with Python and Google Calendar projects.
You can refer to its documentation: https://apscheduler.readthedocs.io/en/3.x/
3. CSV:
CSV stands for Comma-Separated Values, and it is one of the most common formats for spreadsheets. This library, which ships with Python’s standard library, can be very handy when you fill in your tax papers, do some basic accounting, process data, work on spreadsheets, and so on. There are many possibilities, and the format lets you open the file in programs like Excel, Google Sheets, etc.
You can refer to its documentation: https://docs.python.org/3/library/csv.html
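A short example of writing and reading a small expense sheet with the standard-library `csv` module (the file name and rows are made up for illustration):

```python
import csv

rows = [
    {"item": "coffee", "cost": 3.50},
    {"item": "notebook", "cost": 2.00},
]

# Write a small expense sheet; Excel and Google Sheets can open this file.
with open("expenses.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["item", "cost"])
    writer.writeheader()
    writer.writerows(rows)

# Read it back as dictionaries keyed by the header row.
# Note: DictReader returns every value as a string.
with open("expenses.csv", newline="") as f:
    data = list(csv.DictReader(f))

print(data)
```

Passing `newline=""` when opening the file is the documented way to avoid blank lines appearing between rows on Windows.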
4. WATCHDOG:
How often do you want to watch a folder and trigger an action when something in it changes? watchdog is a module that lets us monitor the file system. Got a messy Desktop? Use watchdog.
You can refer to its documentation: https://pypi.org/project/watchdog/
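A sketch of watching a folder for new files with watchdog; the watched path is a placeholder you would point at your Desktop or downloads folder:

```python
import time

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

class NewFileHandler(FileSystemEventHandler):
    """React whenever a file appears in the watched folder."""

    def on_created(self, event):
        if not event.is_directory:
            print(f"New file: {event.src_path}")

if __name__ == "__main__":
    observer = Observer()
    # Hypothetical path -- point this at the folder you want to tidy.
    observer.schedule(NewFileHandler(), path=".", recursive=False)
    observer.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        observer.stop()
    observer.join()
```

Inside `on_created` you could, for example, move the file into a subfolder based on its extension.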
5. SELENIUM, BEAUTIFULSOUP, SCRAPY:
It’s hard to imagine automation without web scraping. For example, let’s say you want to keep track of prices on Amazon or automate ordering food from a restaurant. For these kinds of cases, it might be better to use web scraping tools.
Since Selenium is also used for test automation, you can use it to perform actions on websites such as filling in forms, clicking buttons, etc. BeautifulSoup is good for simple projects.
Scrapy might seem a bit complex to learn but keep in mind that it suits complex projects and is significantly faster. So do your research and find which one fits your needs.
You can refer to their documentation: For BeautifulSoup, https://www.crummy.com/software/BeautifulSoup/bs4/doc/
For Scrapy, https://scrapy.org/
For Selenium, https://selenium-python.readthedocs.io/
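To give a feel for the simplest of the three, here is BeautifulSoup pulling a title and price out of an HTML snippet (the markup stands in for a page you would have downloaded, e.g. with Requests):

```python
from bs4 import BeautifulSoup

# A snippet standing in for a downloaded product page.
html = """
<div class="product">
  <h2 class="title">USB cable</h2>
  <span class="price">$7.99</span>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
title = soup.find("h2", class_="title").get_text(strip=True)
price = soup.find("span", class_="price").get_text(strip=True)
print(title, price)  # USB cable $7.99
```

For a price tracker, you would fetch the real page first, then run extractions like this on a schedule.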
6. TWILIO:
Let’s say you build automation to track prices. How would you get notified when prices reach the level you want? In these kinds of cases, you can use the Twilio library, which lets you send text messages and make phone calls.
You can refer to its documentation: https://www.twilio.com/docs/libraries/python
7. RANDOM USER AGENT:
The last interesting library for automation is called Random User Agent. It lets you attach a random user agent to each of your requests. Using random user agents helps reduce the chance of being blocked while scraping data or sending many requests.
You can refer to its documentation: https://pypi.org/project/random-user-agent/