Importing data into a database manually, especially when there are several tables, is both tiresome and time consuming. Python libraries make this much easier.
Download the painting dataset from Kaggle. The dataset is made up of 8 CSV files that we will import into our database with a simple Python script, instead of importing each table manually.
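Before running the import it can help to confirm that all eight files are where we expect them. Below is a minimal sketch, assuming the CSVs were extracted to the same folder used later in the script:

from pathlib import Path

# Assumed location of the extracted Kaggle csv files (same folder used below)
data_dir = Path(r"C:\Users\Admin\Desktop\famous painti")

csv_files = sorted(data_dir.glob("*.csv"))
print(f"Found {len(csv_files)} csv files:")
for path in csv_files:
    print(" -", path.name)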
First, create an empty database in pgAdmin:

CREATE DATABASE painting;
pip install sqlalchemy
pip install pandas
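Note that SQLAlchemy's postgresql:// URL uses the psycopg2 driver by default, and that driver is not installed automatically. Assuming you want the prebuilt binary wheel, it can be added with:

pip install psycopg2-binary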
import pandas as pd
from sqlalchemy import create_engine
conn_string = 'postgresql://postgres:1344@localhost/painting'
db = create_engine(conn_string)
conn = db.connect()
conn_string stores the URL to our database and follows the pattern dialect://username:password@host/database: postgresql is the dialect (we created the database in pgAdmin, so it is a PostgreSQL database), postgres is the default username, 1344 is the password we set in pgAdmin, localhost is the host name and painting is the database name.
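If you would rather not hard-code the credentials into a single string, SQLAlchemy (1.4 and later) can also build the URL from its parts. A minimal sketch using the same assumed credentials as above:

from sqlalchemy import create_engine
from sqlalchemy.engine import URL

# Same assumed credentials as conn_string above
url = URL.create(
    drivername="postgresql",
    username="postgres",
    password="1344",       # special characters are escaped for us when the URL is rendered
    host="localhost",
    database="painting",
)

db = create_engine(url)
conn = db.connect()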
# table names, one per csv file
files = ['artist', 'canvas_size', 'image_link', 'museum', 'museum_hours',
         'product_size', 'subject', 'work']

for file in files:
    # read each csv and write it to a table of the same name
    df = pd.read_csv(fr"C:\Users\Admin\Desktop\famous painti\{file}.csv")
    df.to_sql(file, con=conn, if_exists='replace', index=False)
files contains the names we want to give the tables in the painting database. The for loop iterates over these names, reading each CSV and writing it to a table. The fr prefix in pd.read_csv marks the path as both an f-string (so {file} is substituted) and a raw string (so the backslashes are not treated as escape characters). index must be set to False so that pandas does not write its default integer index as an extra column.
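To confirm that every table landed in the database, one option is to count the rows in each table. A minimal sketch, assuming the same conn and files from above:

from sqlalchemy import text

# Quick sanity check: count rows in each freshly created table
for file in files:
    count = conn.execute(text(f'SELECT COUNT(*) FROM "{file}"')).scalar()
    print(f"{file}: {count} rows")

If a table is missing or empty, re-check the file path and table name for that entry.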
We have learned how to use Python, pandas and SQLAlchemy to load an entire set of CSV files into tables in our PostgreSQL database, instead of doing it manually.