```elixir
schema "locations" do
  field :name, :string
  field :latitude, :float
  field :longitude, :float
end
```

The locations table stores location data as latitude and longitude coordinates together with a name. For example, this could be a street address with a house number and its geocoded lat/long position. Now, let's imagine you have a CSV file called locations.csv with 100,000 rows of location data and you want to import the data into your locations table. In the following, we will use Postgres' COPY function for that. First, we will call it directly from a psql session, and then we will wrap it into a simple Mix.Task. Let's go!

Most efficient: Postgres' COPY

The fastest way of importing your data is to use Postgres' COPY function. You can use it in two ways: if your CSV file exists on the same filesystem as your Postgres database, you can ask Postgres to simply move the data into the database. However, this won't work when you want to dump the data into a hosted database, since the file exists only in your local filesystem. In that case, you can either use COPY's little brother, the \copy command, or leverage a Postgrex connection to stream the file through your Postgres session. We'll see how we can do that using Elixir later on.

In all cases, Postgres' COPY is mind-bogglingly fast, but it has some limitations: the inserted_at and updated_at timestamps won't be auto-generated, and importing a file without these timestamps will fail because of the not-null constraints on the timestamp fields. If you need timestamps but can't add them to the file, consider adding your timestamp fields after you imported the data with, for example:

```
app_dev=# COPY locations(name,latitude,longitude)
FROM './relative-or-absolute-path-to-your-file/data.csv'
```

This command took only a second to run and imported almost 70k rows of location data! Amazing!
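Since the article promises to wrap the import into a Mix.Task, here is a minimal sketch of how that could look using Postgrex's support for `COPY ... FROM STDIN` through `Ecto.Adapters.SQL.stream/2`. The module, task, and repo names (`Mix.Tasks.ImportLocations`, `App.Repo`) are assumptions for illustration, not the article's actual code.

```elixir
defmodule Mix.Tasks.ImportLocations do
  use Mix.Task

  @shortdoc "Imports a CSV file into the locations table via COPY"

  @impl Mix.Task
  def run([path]) do
    # Boot the application so the Repo is started.
    Mix.Task.run("app.start")

    App.Repo.transaction(fn ->
      stream =
        Ecto.Adapters.SQL.stream(
          App.Repo,
          "COPY locations(name,latitude,longitude) FROM STDIN WITH (FORMAT csv)"
        )

      # Pipe the file into the COPY command in 2 KB chunks, so the
      # whole CSV never has to fit into memory at once.
      Enum.into(File.stream!(path, [], 2048), stream)
    end)
  end
end
```

Because the data goes over the Postgrex connection rather than being read by the server, this variant also works against a hosted database where the file only exists locally.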
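As for backfilling the auto-generated timestamps after such an import, one option is a single `update_all` over the freshly imported rows. This is a hypothetical sketch, assuming an `App.Repo` and the usual `inserted_at`/`updated_at` columns:

```elixir
import Ecto.Query

# Ecto's :naive_datetime timestamps have second precision, so truncate.
now = NaiveDateTime.truncate(NaiveDateTime.utc_now(), :second)

# Fill in timestamps for all rows the COPY import left empty.
App.Repo.update_all(
  from(l in "locations", where: is_nil(l.inserted_at)),
  set: [inserted_at: now, updated_at: now]
)
```

This requires the not-null constraints on the timestamp columns to be relaxed (or added only after the import), as the article notes that a plain COPY without timestamp values would otherwise fail.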