Efficient Data Import into MySQL and PostgreSQL: Mastering Command-Line Techniques
Last Updated on December 21, 2023 by Editorial Team
Author(s): Abdulraqib Omotosho
Originally published on Towards AI.
In the field of database management, the ability to seamlessly import data is a crucial skill for any developer or data professional. While graphical tools such as pgAdmin and MySQL Workbench provide convenient interfaces for managing databases, there's a distinct power and efficiency in using the command line to import data directly into MySQL and PostgreSQL. This article walks you through the command-line methods for importing data, offering a hands-on approach that not only sharpens your command-line skills but also gives you a deeper understanding of the underlying processes. Whether you're a seasoned command-line enthusiast or a newcomer looking to broaden your database management skills, it will enable you to import data efficiently into MySQL and PostgreSQL and pave the way for streamlined, scriptable database interactions.
PostgreSQL
PostgreSQL, often referred to as "Postgres," is a robust, open-source relational database management system (RDBMS). What sets PostgreSQL apart is its extensibility, standards compliance, and advanced feature set, which make it a powerful choice for handling diverse data scenarios. Its support for a wide range of data types, indexing options, and complex queries positions it as a versatile solution for data management. When it comes to importing data, PostgreSQL's reliability and scalability, along with its ability to handle large datasets and complex operations, make the process smooth and efficient.
Without further ado, let's get to the heart of the matter…
Firstly, for Windows users, let's open the SQL Shell (psql). Launch the application by searching for "SQL Shell" or "psql" in the Start menu. For Mac users, open the Terminal application, which serves as the command-line interface; you can find it in the Applications > Utilities folder or by searching for "Terminal" using Spotlight.
Next, in the SQL Shell (or Terminal), press Enter to accept the defaults for Server (localhost), Database (postgres), Port (5433), and Username (postgres), or type your own values at each prompt. You'll then be asked for the database password. Once it is accepted, you will land at the postgres=# prompt.
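If you are starting from a plain terminal instead (for example, the macOS Terminal), the same connection can be opened with a single command. This is a minimal sketch assuming psql is on your PATH, using the same values as above (adjust the port if your installation uses the default 5432):
psql -h localhost -p 5433 -U postgres
You will be prompted for the password just as in the SQL Shell.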
Following the navigation steps, create a database using the command below. In this instance, I've named it "salesrecord".
CREATE DATABASE salesrecord;
Now, let's connect to the newly created database, "salesrecord," using the following command:
\c salesrecord
Next, let's create a table named "salesdata" in the database with the specified column names and data types using the following command:
CREATE TABLE salesdata (
region VARCHAR(255),
country VARCHAR(255),
item_types VARCHAR(255),
sales_channel VARCHAR(255),
order_priority VARCHAR(255),
order_date DATE,
order_id BIGINT,
ship_date DATE,
units_sold BIGINT,
unit_price NUMERIC,
unit_cost NUMERIC,
total_revenue DECIMAL(10, 2),
total_cost NUMERIC,
total_profit DECIMAL(10, 2)
);
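To confirm that the table was created with the expected columns, you can describe it with psql's \d meta-command:
\d salesdata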
Finally, effortlessly import data into the "salesdata" table within seconds using the command below. Remember to specify the complete path to the file for seamless execution. In this instance, I'm importing from the "Datasets" directory within the "Python" folder on the Desktop, located in the Users directory on my local drive.
COPY salesdata
FROM 'C:/Users/user/Desktop/__Python__/Datasets/500000 Sales Records.csv'
WITH (FORMAT csv, DELIMITER ',', HEADER true);
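A note on this command: COPY ... FROM reads the file on the database server, which works here because the server runs locally and we are connected as the postgres superuser. If you ever connect as a regular user or to a remote server, psql's client-side \copy meta-command is the usual alternative; it takes the same options but reads the file from your machine and must be written on a single line:
\copy salesdata FROM 'C:/Users/user/Desktop/__Python__/Datasets/500000 Sales Records.csv' WITH (FORMAT csv, DELIMITER ',', HEADER true)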
After following all the above steps, we have our data successfully imported into the table in just a few seconds.
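A quick sanity check confirms the import by counting the rows and peeking at a sample:
SELECT COUNT(*) FROM salesdata;
SELECT * FROM salesdata LIMIT 5;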
MySQL
MySQL is a robust, open-source relational database management system. It's known for its adaptability, standards compliance, and mature feature set, making it well suited to handling many different types of data. With support for a wide range of data types and efficient ways of organizing data, MySQL is a versatile choice. When it comes to importing data, MySQL is reliable, scalable, and flexible: it handles large volumes of data and complex operations, which makes for a smooth and effective import process.
Without further ado, let's get to the heart of the matter…
For Windows users, search for "Command Prompt" in the Start menu, then right-click on "Command Prompt" and select "Run as administrator" to open an elevated prompt from which we will run the MySQL command-line client.
Navigate to the MySQL command-line directory on your local drive using the terminal. In my case, I achieved this by executing the following command. This command ensures that you are in the correct directory for accessing MySQL's command-line tools.
cd C:\Program Files\MySQL\MySQL Server 8.0\bin
Next, run the below command, which prompts you to enter your password.
mysql -u root -p --local_infile
After entering the password successfully, you will land at the mysql> prompt.
Following the navigation steps, create a database using the command below. In this instance, I've named it "salesrecord".
CREATE DATABASE salesrecord;
Now, let's connect to the newly created database, "salesrecord," using the following command:
USE salesrecord;
Setting local_infile to ON becomes essential when you want to load data efficiently from a local file into a MySQL table using the LOAD DATA LOCAL INFILE statement. This streamlines the import process, but be aware that enabling local_infile introduces security considerations, particularly if the MySQL server is accessible over a public network. Once the import is complete, disable the option again with SET GLOBAL local_infile = 'OFF'; to minimize potential vulnerabilities and keep the MySQL environment secure. In short, enable local_infile selectively for specific tasks and turn it off promptly afterward.
SET GLOBAL local_infile = 'ON';
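You can verify that the setting took effect, and, as discussed above, switch it back off once the import further down is finished:
SHOW GLOBAL VARIABLES LIKE 'local_infile';
-- run this after the import is complete:
SET GLOBAL local_infile = 'OFF';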
Next, let's create a table named "salesdata" in the database with the specified column names and data types using the following command:
CREATE TABLE salesdata (
region VARCHAR(255),
country VARCHAR(255),
item_types VARCHAR(255),
sales_channel VARCHAR(255),
order_priority VARCHAR(255),
order_date DATE,
order_id BIGINT,
ship_date DATE,
units_sold BIGINT,
unit_price DECIMAL(10, 2),
unit_cost DECIMAL(10, 2),
total_revenue DECIMAL(10, 2),
total_cost DECIMAL(10, 2),
total_profit DECIMAL(10, 2)
);
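As on the PostgreSQL side, you can confirm the table definition before loading anything:
DESCRIBE salesdata;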
Finally, effortlessly import data into the "salesdata" table within seconds using the command below. Remember to specify the complete path to the file for seamless execution. In this instance, I'm importing from the below directory on my local drive.
LOAD DATA LOCAL INFILE 'C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Uploads\\500000 Sales Records.csv'
INTO TABLE salesdata
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
After completing the aforementioned steps, our data has been seamlessly imported into the database table within a matter of seconds.
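If the load reported any warnings, running SHOW WARNINGS immediately after the LOAD DATA statement lists them, and a quick count confirms that all the rows arrived:
SHOW WARNINGS;
SELECT COUNT(*) FROM salesdata;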
In conclusion, this tutorial has equipped you with essential command-line skills for swift and efficient data management in both MySQL and PostgreSQL. By steering clear of the sluggish import workflows of graphical interfaces like MySQL Workbench or pgAdmin, it champions the command line for a faster, more efficient data import experience. Ditch slow clicks and embrace speedy command lines.
Happy querying!!
Thanks for reading! I hope you found this piece enjoyable 🤗. Connect with me on X (formerly Twitter), GitHub or LinkedIn and explore more of my stories here. Cheers!