Node.js Introduction, Part 2: Streams

Everyone who has studied C++ remembers IOStream from the standard library, the input and output streams. Look at this code:

using namespace std;
cout<<x;

My eyes are getting misty; these are memories of college. Back then we slaved over C++, completely stuck on freeing pointers and on pointers to pointers, and what I remember is not just the code but my youth along with it. Anyway, my brick-laying back is getting sore again, so let's come back to reality and look at streams in Node.js.

So what exactly is a stream...

A stream is just what the name says: flowing water (the English word "stream" means a brook). When binary data is carried continuously from one place to another, flowing like water, that is a stream.
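The smallest possible illustration: standard input and standard output are themselves streams, so one line is enough to connect them (a minimal sketch):

// echo.js - pipe whatever arrives on stdin straight back out on stdout
process.stdin.pipe(process.stdout);
// run: node echo.js, then type a few lines; each one comes back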

A stream is an abstract interface implemented by various objects in Node.js. For example a request to an HTTP server is a stream, as is process.stdout. Streams are readable, writable, or both. All streams are instances of EventEmitter.
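Since every stream is an EventEmitter, you can check that directly and hang listeners on it. A quick sketch; the file name is just a placeholder, any readable file will do:

var fs = require('fs');
var EventEmitter = require('events').EventEmitter;

var readable = fs.createReadStream('example.txt'); // placeholder file name

console.log(readable instanceof EventEmitter); // true
readable.on('data', function(chunk) {
    console.log('got ' + chunk.length + ' bytes'); // fires once per chunk
});
readable.on('end', function() {
    console.log('done'); // fires when the file has been fully read
});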

Anyone willing to chew through the docs can take a look at stream.

Examples of streams

Node.js is very good at handling data (the data here might be a server's web pages, returned JSON, or something else), so let's look at a few examples that show how important streams are to a server. Many classes inside Node.js inherit the stream interface.

Building an echo server

Echo means shouting at a mountain and hearing your own voice come back; here let's have a server do this boring little job.

var http = require('http');
http.createServer(function(request, response) {
    response.writeHead(200);
    request.pipe(response); // pipe the request body straight back out as the response
}).listen(8080);

After running it, try curl -d 'hello' http://localhost:8080. It is hard to believe a server can be written this simply; that is the charm of Node.js.
The pipe above means exactly what it does on the Linux command line: the | operator. Most people are familiar with command-line pipes, and the concept is the same here. You have probably also heard of

gulp

which is built on top of streams.
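For example, a typical gulpfile is nothing but a chain of pipes. A rough sketch, assuming the gulp and gulp-uglify packages are installed (the task name and paths are made up for illustration):

var gulp = require('gulp');
var uglify = require('gulp-uglify'); // hypothetical plugin choice for this sketch

gulp.task('scripts', function() {
    return gulp.src('src/*.js')    // gulp.src() gives a readable stream of files
        .pipe(uglify())            // each plugin is a transform stream
        .pipe(gulp.dest('dist'));  // gulp.dest() is the writable end
});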

Uploading a file

Next let's look at a file upload example.

var http = require('http');
var fs = require('fs');
http.createServer(function(request, response) {
    // write each upload into its own file
    var newFile = fs.createWriteStream("copy" + new Date() + ".md");
    // total upload size, taken from the request headers
    var fileBytes = request.headers['content-length'];
    var uploadedBytes = 0;

    response.write("server receive request\n");
    // stream the request body straight into the file
    request.pipe(newFile);

    // track progress by reading the same chunks as they arrive
    request.on('readable', function() {
        var chunk = null;
        response.write("progress: start\n");
        while (null !== (chunk = request.read())) {
            uploadedBytes += chunk.length;
            var progress = (uploadedBytes / fileBytes) * 100;
            response.write("progress: " + parseInt(progress, 10) + "%\n");
        }
    });

    request.on('end', function() {
        response.end('uploaded!\n');
    });

}).listen(8080);
//curl --upload-file uploadFiles.js http://localhost:8080

File upload example

The things to notice here:

  1. How progress is reported: request.on('readable', function() {...}. Do you see the benefit of this asynchronous I/O style? (An alternative using the 'data' event is sketched after this list.)
  2. How the file is saved: request.pipe(newFile);. Isn't that convenient?
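If the 'readable' + read() loop feels unfamiliar, the same progress reporting can also be sketched with the 'data' event. A minimal variant (listening on port 8081 here just to keep it separate from the example above):

var http = require('http');
var fs = require('fs');

http.createServer(function(request, response) {
    var newFile = fs.createWriteStream('copy-' + Date.now() + '.md');
    var fileBytes = request.headers['content-length'];
    var uploadedBytes = 0;

    request.pipe(newFile); // save the upload, same as before

    // 'data' fires for every chunk, so the progress calculation can live in the listener
    request.on('data', function(chunk) {
        uploadedBytes += chunk.length;
        var progress = (uploadedBytes / fileBytes) * 100;
        response.write('progress: ' + parseInt(progress, 10) + '%\n');
    });

    request.on('end', function() {
        response.end('uploaded!\n');
    });
}).listen(8081);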

Implementing a stream

Above we saw how simple streams are to set up and use; now let's look at how Node.js streams are designed.

To implement any sort of stream, the pattern is the same:

  1. Extend the appropriate parent class in your own subclass. (The util.inherits() method is particularly helpful for this.)
  2. Call the appropriate parent class constructor in your constructor, to be sure that the internal mechanisms are set up properly.
  3. Implement one or more specific methods, as detailed below.

The class to extend and the method(s) to implement depend on the sort of stream class you are writing:

The key points

Translating the stream-implementation process:

  1. Extend the appropriate class
  2. Don't forget to call the base class constructor
  3. Override the base class methods

A readable stream that counts

One example makes it clear: the program below simply counts from 1 to 1,000,000.

const Readable = require('stream').Readable;
const util = require('util');
util.inherits(Counter, Readable);

function Counter(opt) {
    Readable.call(this, opt);
    this._max = 1000000;
    this._index = 1;
}

Counter.prototype._read = function() {
    var i = this._index++;
    if (i > this._max)
        this.push(null); // pushing null signals the end of the stream
    else {
        var str = '' + i;
        var buf = Buffer.from(str, 'ascii'); // Buffer.from replaces the deprecated new Buffer()
        this.push(buf);
    }
};

///////////////////////////////////////////////////////////
//test 
var fs = require('fs');
var newFile = fs.createWriteStream("test_counter.txt");
var myCounter = new Counter();
myCounter.pipe(newFile);
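On newer Node versions the same three steps can also be written with ES2015 class syntax instead of util.inherits. A rough sketch of the same counter:

const stream = require('stream');
const fs = require('fs');

class Counter extends stream.Readable {
    constructor(opt) {
        super(opt);           // step 2: call the parent constructor
        this._max = 1000000;
        this._index = 1;
    }

    _read() {                 // step 3: override the base class method
        const i = this._index++;
        if (i > this._max) {
            this.push(null);  // null signals the end of the stream
        } else {
            this.push(Buffer.from('' + i, 'ascii'));
        }
    }
}

new Counter().pipe(fs.createWriteStream('test_counter_class.txt'));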

The Counter above completes the three-step pattern, and the test program pipes the counter's output to a file. If we want to implement a stream of our own, that is all it takes. If this example seems too simple, let's look at something a bit more complex, such as a transform stream.

What is a transform stream?

Transform streams are Duplex streams where the output is in some way computed from the input. They implement both the Readable and Writable interfaces. Examples of Transform streams include zlib streams and crypto streams.
In plain terms, a transform stream takes the input, transforms it in some way, and outputs the result: compression and encryption, for example.

const zlib = require('zlib');
const fs = require('fs');
const gzip = zlib.createGzip();
const inp = fs.createReadStream('input.txt');
const out = fs.createWriteStream('input.txt.gz');

inp.pipe(gzip).pipe(out);
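Encryption works the same way: the cipher object from the crypto module is itself a transform stream. A sketch with a throwaway key and IV, purely for illustration (this is not a key-management scheme):

const crypto = require('crypto');
const fs = require('fs');

const key = crypto.randomBytes(32); // 256-bit key for aes-256-cbc
const iv = crypto.randomBytes(16);  // 128-bit initialization vector
const cipher = crypto.createCipheriv('aes-256-cbc', key, iv);

fs.createReadStream('input.txt')
  .pipe(cipher)                               // transform: plaintext -> ciphertext
  .pipe(fs.createWriteStream('input.txt.enc'));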

Implementing a transform stream

This example parses incoming data and produces a readable stream, a stream that has been transformed along the way.

  1. The format being parsed: a data stream containing two consecutive newline characters; everything before them is the header, everything after is the body.

During parsing:

  1. A 'header' event is emitted to expose the header information.
  2. Finally the header is stripped off and only the body content is kept.

Now let's look at the code.

const util = require('util');
const Transform = require('stream').Transform;
util.inherits(SimpleProtocol, Transform);

function SimpleProtocol(options) {
  if (!(this instanceof SimpleProtocol))
    return new SimpleProtocol(options);

  Transform.call(this, options);
  this._inBody = false;
  this._sawFirstCr = false;
  this._rawHeader = [];
  this.header = null;
}

SimpleProtocol.prototype._transform = function(chunk, encoding, done) {
  if (!this._inBody) {
    // check if the chunk has a \n\n
    var split = -1;
    for (var i = 0; i < chunk.length; i++) {
      if (chunk[i] === 10) { // '\n'
        if (this._sawFirstCr) {
          split = i;
          break;
        } else {
          this._sawFirstCr = true;
        }
      } else {
        this._sawFirstCr = false;
      }
    }

    if (split === -1) {
      // still waiting for the \n\n
      // stash the chunk, and try again.
      this._rawHeader.push(chunk);
    } else {
      this._inBody = true;
      var h = chunk.slice(0, split);
      this._rawHeader.push(h);
      var header = Buffer.concat(this._rawHeader).toString();
      try {
        this.header = JSON.parse(header);
      } catch (er) {
        this.emit('error', new Error('invalid simple protocol data'));
        return;
      }
      // and let them know that we are done parsing the header.
      this.emit('header', this.header);

      // now, because we got some extra data, emit this first.
      this.push(chunk.slice(split));
    }
  } else {
    // from there on, just provide the data to our consumer as-is.
    this.push(chunk);
  }
  done();
};

// Usage:
var fs = require('fs');
const source = fs.createReadStream('input.txt');
const out = fs.createWriteStream('output.txt');

var parser = new SimpleProtocol();

// Now parser is a readable stream that will emit 'header'
// with the parsed header data.
source.pipe(parser).pipe(out);
parser.on('header',function(header){
  console.log(header);
});
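To try it out, input.txt has to follow the format described above: a JSON header, a blank line, then the body. A hypothetical test file could be generated like this (the header fields are made up):

const fs = require('fs');
// header (valid JSON), then a blank line (\n\n), then the body
fs.writeFileSync('input.txt',
    JSON.stringify({ from: 'demo', type: 'greeting' }) + '\n\n' +
    'hello, this is the body\n');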

The code is a bit long, but it has comments, so I won't walk through it; just note how it is used at the end. Let's look at the result of running it.

Run result

That is it for the introduction to streams. If things are still not completely clear, you can look at the Node source code (node in github) or the documentation (stream).