Analyzing large amounts of data has become an everyday problem. Oftentimes, the datasets are only available as CSV files, which raises the question: how can you import them into your Postgres database? The short answer: by using Postgres’ COPY function. Here’s the long answer:

Let’s imagine you have an
Ecto.Schema called Location with the following definition:

The locations table stores location data as latitude and longitude coordinates together with a name. For example, this could be a street address with a house number and its geocoded lat + long position.

Now, let’s imagine you have a CSV file called
locations.csv with 100,000 rows of location data, and you want to import the data into your locations table. In the following, we will use Postgres’ COPY function for that. First, we will call it directly from a psql session, and then we will wrap it in a simple Mix.Task. Let’s go!
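The Location schema definition itself isn’t included in this excerpt. A minimal sketch consistent with the fields mentioned above could look like this; the module name MyApp.Location and the :float column types are assumptions:

```elixir
defmodule MyApp.Location do
  use Ecto.Schema

  # Maps to the "locations" table described above.
  schema "locations" do
    field :name, :string
    field :lat, :float
    field :long, :float

    timestamps()
  end
end
```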
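Calling COPY directly from a psql session could look like the following sketch; the column list and the CSV options (header row, comma-separated) are assumptions about the file’s shape:

```sql
-- Client-side \copy reads the file from the machine running psql;
-- a plain server-side COPY would need the file on the database server.
\copy locations(name, lat, long) FROM 'locations.csv' WITH (FORMAT csv, HEADER true)
```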
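Wrapping the same COPY call in a Mix.Task might look roughly like this sketch, which streams the file into a COPY ... FROM STDIN statement; the task name, MyApp.Repo, and the column list are assumptions, not taken from the original post:

```elixir
defmodule Mix.Tasks.ImportLocations do
  use Mix.Task

  @shortdoc "Imports a CSV file into the locations table via COPY"

  @impl Mix.Task
  def run([path]) do
    # Start the application so the Repo is up.
    Mix.Task.run("app.start")

    # Streamed COPY ... FROM STDIN must run inside a transaction.
    MyApp.Repo.transaction(fn ->
      stream =
        Ecto.Adapters.SQL.stream(
          MyApp.Repo,
          "COPY locations(name, lat, long) FROM STDIN WITH (FORMAT csv, HEADER true)"
        )

      # Pipe the CSV file into the COPY stream, line by line.
      Enum.into(File.stream!(path), stream)
    end)
  end
end
```

Invoked as mix import_locations locations.csv, this streams the file instead of loading all 100,000 rows into memory at once.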
continue reading on peterullrich.com
If this post was enjoyable or useful for you, please share it! If you have comments, questions, or feedback, you can send them to my personal email. To get new posts, subscribe via the RSS feed.
