crawler

Crawler is a ready-to-use web spider with support for proxies, asynchronous requests, rate limiting, configurable request pools, jQuery-style DOM access, and HTTP/2.

multiPriorityQueue.d.ts
declare class multiPriorityQueue<T> {
    private _elements;
    private _size;
    constructor(priorities: number);
    size(): number;
    enqueue(value: T, priority: number): void;
    dequeue(): T | undefined;
}
export default multiPriorityQueue;
//# sourceMappingURL=multiPriorityQueue.d.ts.map
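The declaration above only exposes the shape of the class, not its implementation. A minimal sketch of how such a multi-priority queue could be implemented, assuming a bucket-per-priority design (one FIFO array per priority level, with lower numbers dequeued first — the actual package internals may differ):

```typescript
// Hypothetical implementation matching the declared interface:
// one FIFO bucket per priority level; dequeue scans from priority 0 upward.
class MultiPriorityQueue<T> {
  private _elements: T[][];
  private _size: number;

  constructor(priorities: number) {
    // Clamp to at least one priority level.
    const levels = Math.max(1, Math.floor(priorities));
    this._elements = Array.from({ length: levels }, () => []);
    this._size = 0;
  }

  size(): number {
    return this._size;
  }

  enqueue(value: T, priority: number): void {
    // Out-of-range priorities are clamped into the valid bucket range.
    const p = Math.min(Math.max(0, priority), this._elements.length - 1);
    this._elements[p].push(value);
    this._size++;
  }

  dequeue(): T | undefined {
    // Return the oldest element from the highest-priority non-empty bucket.
    for (const bucket of this._elements) {
      if (bucket.length > 0) {
        this._size--;
        return bucket.shift();
      }
    }
    return undefined;
  }
}
```

In a crawler, such a structure lets urgent requests (e.g. retries or seed URLs) jump ahead of bulk page fetches while requests of equal priority keep their submission order.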