Working with big files in golang
Today, I am going to show you how to read files in golang line by line. Let's imagine that we have a jsonl
file. What's jsonl? In simple terms, it's "json lines": a file in which each line represents a valid json
object. So if we read the file line by line, we can Marshal/Unmarshal each line separately. Here's an example of a jsonl
file.
Each line of this file represents the data of one World Cup.
{"year":"2018","host":"Russia","winner":"France"}
{"year":"2014","host":"Brazil","winner":"Germany"}
{"year":"2010","host":"South Africa","winner":"Spain"}
{"year":"2006","host":"Germany","winner":"Italy"}
Working with bufio.Scanner
So let’s read this file line by line in the easiest and most convenient way. The easiest way to read a file (at least for me) is using the scanner from the bufio package in the standard library. First, we need to create an instance with the NewScanner
function, which is a very familiar way of constructing structs in golang. This function accepts an io.Reader interface as input, and the good news is that os.File
implements this interface. That means we can open a file and pass the pointer of the file to bufio.NewScanner. Let’s see it in action.
// first open the file
file, err := os.Open("/Users/Mark/fifa-winners.jsonl")
if err != nil {
	log.Fatalf("could not open the file: %v", err)
}
// don't forget to close the file
defer file.Close()
// finally, we can have our scanner
scanner := bufio.NewScanner(file)
So, we have the scanner and we are ready to go. The scanner has a function named Scan. This function moves the scanner to the next token. I'll tell you what that means, but for now, let's say each time Scan is called, we read one line of our file. So if we want to move the scanner all through the file, we'd call the Scan function in an infinite loop! The question is: how do we know when we can break the loop? It's easy! Scan returns true unless it reaches the end of the file.
for {
	if scanner.Scan() {
		// we have a new line in each iteration
		continue
	}
	// we are done, let's break the loop
	break
}
// the rest of our spaghetti
This code works, but did you know we can tell golang to keep a loop running until a specific condition is met? All the code above can be as simple as the code below:
for scanner.Scan() {
	// we have a new line in each iteration
}
// the rest of our spaghetti
Well, let's get back to the point! We can easily get the bytes or the string of each line by calling the Bytes() and Text() functions.
for scanner.Scan() {
	// b is a slice of bytes ([]byte)
	b := scanner.Bytes()
	// s is a string
	s := scanner.Text()
}
Frankly, these functions return the same content! For example, string(scanner.Bytes()) will give you the same result, and that's exactly what happens inside the Text() function.
We read our file, so is the mission complete? Not exactly, because we haven't handled any errors yet.
The scanner has another function called Err()
. This function gives you the first error that happened during the scan process. When the scanner is trying to move through the file and something bad happens, the Scan function returns false, so our loop instantly breaks and we are out of it. Now we can get that error and deal with it.
If we want to know in which line of the file the error happened, we use the traditional way: we know the scanner starts from the beginning of the file (line number 1), so we can define a variable outside of the loop that represents the line number and increase it in each iteration.
lineNumber := 1
for scanner.Scan() {
	fmt.Println(scanner.Text())
	lineNumber++
}
// the rest of our spaghetti
if err := scanner.Err(); err != nil {
	log.Fatalf("something bad happened in line %v: %v", lineNumber, err)
}
Another thing we should consider about the Err() function is that it ignores io.EOF, so if we get an error, it's a REAL one!
Let's have a run:
➜ big-files (main) ✗ go run main.go
{"year":"2018","host":"Russia","winner":"France"}
{"year":"2014","host":"Brazil","winner":"Germany"}
{"year":"2010","host":"South Africa","winner":"Spain"}
{"year":"2006","host":"Germany","winner":"Italy"}
It worked! So, what's next?
Fix bufio.Scanner: token too long error
We said each line of a jsonl file represents a valid json object, so it could be very long. We also said the Scan
function moves the scanner to the next token, but the question is: where is the next token!?
The scanner has another method that is not as famous as its siblings: Buffer
. It takes a buffer and an integer as input, and you can set the maximum size of the scanner's buffer with this function.
The bufio package has a default maximum token size of 64 * 1024
bytes (64KB). So if one of our lines is bigger than this size, we get the token too long error
.
We found the answer to our question: the next token ends where the scanner reaches the maximum size (64KB by default) OR the end of the line.
Approach 1: Bigger buffer size
The first approach to tackling this problem is to increase the buffer size. Actually, the name bufio.MaxScanTokenSize
is a little misleading, because it's not the actual maximum, it's THE DEFAULT MAXIMUM size. So we can increase it.
buf := []byte{}
scanner := bufio.NewScanner(file)
// increase the maximum buffer size to 2MB
scanner.Buffer(buf, 2048*1024)
Now we can process jsonl
files with lines up to 2MB. That's good, but what if we need more? We can increase this number as much as we want (probably), but if our file has 5,000,000 rows and just one of them is 100MB, we'd have to let the scanner's buffer grow to that size just for one line, or use another approach!
Approach 2: Reading with bufio.Reader
The next way to read such a tough file is using another function! bufio (buffered I/O) gives us way more than a simple scanner to work with files, and we have to choose one of its tools based on our needs and requirements. In this case, the Scanner cannot satisfy what we need, so let's take a look at the ReadLine
function of bufio.Reader
. It's a little bit lower level than the scanner. Generally speaking, when you hear the words lower level,
you should expect to do more work for simple things, but you get more access and power!
So let's get started. First we need a reader:
reader := bufio.NewReader(file)
The reader has a ReadLine function which tries to read an entire line. Just like with the scanner, we need to call this function in a for loop, but since we are at a lower level, we no longer have a nice, easy boolean return value to tell us when to break the loop.
The other difference is the error that we get from the ReadLine
function, which can also be io.EOF
. That's not a real error for us, so we have to handle it too.
reader := bufio.NewReader(file)
for {
	line, _, err := reader.ReadLine()
	if err != nil {
		if err == io.EOF {
			break
		}
		log.Fatalf("a real error happened here: %v\n", err)
	}
	fmt.Println(string(line))
}
As you probably already know, we have only read the file so far; we didn't actually solve the problem we had with the gigantic lines.
We ignored the second return value of the ReadLine function, and that one is exactly what we need to solve our problem. It's a boolean named isPrefix
. If the line is too long and ReadLine
cannot fit all of its content in the buffer, it returns the filled buffer and sets isPrefix to true, which means we will get the next part of the line in the next call to the ReadLine
function.
So we just need to keep calling the ReadLine function until isPrefix
becomes false
, then we can move on to the next line of our file. You probably already noticed that this calls for a small helper function with a loop inside. First, let's define that function.
func read(r *bufio.Reader) ([]byte, error) {
	var (
		isPrefix = true
		err      error
		line, ln []byte
	)
	for isPrefix && err == nil {
		line, isPrefix, err = r.ReadLine()
		ln = append(ln, line...)
	}
	return ln, err
}
isPrefix
is true initially and err is also nil, so we make sure the for loop runs at least once. It behaves like a do-while loop. We re-assign the variables inside the loop, so we keep calling r.ReadLine
until we get an error OR isPrefix
is false. In each iteration, we append the bytes we get from r.ReadLine()
to another variable. Now it's time to call this function inside the main function.
reader := bufio.NewReader(file)
for {
	line, err := read(reader)
	if err != nil {
		if err == io.EOF {
			break
		}
		log.Fatalf("a real error happened here: %v\n", err)
	}
	fmt.Println(string(line))
}
That's it! We solved the problem. Here's the complete code:
package main

import (
	"bufio"
	"fmt"
	"io"
	"log"
	"os"
)

func main() {
	// first open the file
	file, err := os.Open("./fifa-winners.jsonl")
	if err != nil {
		log.Fatalf("could not open the file: %v", err)
	}
	defer file.Close()

	log.Println("******************* READ WITH SCANNER *******************")
	readWithScanner(file)

	log.Println("******************* READ WITH READLINE() *******************")
	// we just reset the offset, because we already read this file once.
	// imagine the cursor is at the end of the file, so we have to get back
	// to the first line and read it again
	file.Seek(0, io.SeekStart)
	readWithReadLine(file)
	log.Println("we read a file twice!")
}

// readWithScanner reads the file line by line with a bufio.Scanner
func readWithScanner(file *os.File) {
	buf := []byte{}
	scanner := bufio.NewScanner(file)
	// increase the maximum buffer size to 2MB
	scanner.Buffer(buf, 2048*1024)
	lineNumber := 1
	for scanner.Scan() {
		fmt.Println(scanner.Text())
		lineNumber++
	}
	if err := scanner.Err(); err != nil {
		log.Fatalf("something bad happened in line %v: %v", lineNumber, err)
	}
}

// read keeps calling ReadLine until isPrefix is false,
// so even a line bigger than the buffer comes back whole
func read(r *bufio.Reader) ([]byte, error) {
	var (
		isPrefix = true
		err      error
		line, ln []byte
	)
	for isPrefix && err == nil {
		line, isPrefix, err = r.ReadLine()
		ln = append(ln, line...)
	}
	return ln, err
}

// readWithReadLine reads the file line by line with bufio.Reader.ReadLine
func readWithReadLine(file *os.File) {
	reader := bufio.NewReader(file)
	for {
		line, err := read(reader)
		if err != nil {
			if err == io.EOF {
				break
			}
			log.Fatalf("a real error happened here: %v\n", err)
		}
		fmt.Println(string(line))
	}
}