Migrating Large Amounts of Data by Using Streams
In the preceding chapter, we made sure that our users will enjoy our CLI, and we published our first release. Now we want to look at a more advanced topic: streams. Streams are a powerful Node.js feature for processing large amounts of data. With traditional buffering, we quickly run into memory problems because the entire data set may not fit into the computer's memory. Streams let us process data in small slices instead. Node.js streams work like Unix pipes on the terminal, where you route data from a producer into a consumer by using the pipe symbol (|). We will take a look at how piping works by exploring the cat command. Afterward, we will create our own streams and integrate them into our command-line client.
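To give a first impression of the idea before we dive in, here is a minimal sketch of a cat-like script built with Node.js streams. The file name cat.js and the argument handling are illustrative assumptions, not code from this chapter; the point is simply that a readable file stream piped into process.stdout moves data chunk by chunk, so the whole file never has to fit in memory.

```js
// cat.js - a minimal, illustrative cat-like command using streams.
// Usage (assumed): node cat.js ./some-large-file.txt
const fs = require('fs');

const [, , filePath] = process.argv;

// Create a readable stream for the file instead of reading it all at once.
const readStream = fs.createReadStream(filePath);

// pipe() connects the readable file stream to the writable stdout stream
// and handles backpressure between the two automatically.
readStream.pipe(process.stdout);

readStream.on('error', (err) => {
  console.error(`Could not read ${filePath}: ${err.message}`);
  process.exit(1);
});
```

Conceptually, this mirrors what `cat some-large-file.txt | less` does on the terminal: the file is the producer, standard output is the consumer, and the pipe moves data between them in small chunks.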