fastq
Fast, in-memory work queue.
Benchmarks (1 million tasks):
- setImmediate: 812ms
- fastq: 854ms
- async.queue: 1298ms
- neoAsync.queue: 1249ms
Obtained on node 12.16.1, on a dedicated server.
If you need zero-overhead series function call, check out fastseries. For zero-overhead parallel function call, check out fastparallel.
Install
npm i fastq --save
Usage (callback API)
'use strict'
const queue = require('fastq')(worker, 1)
queue.push(42, function (err, result) {
if (err) { throw err }
console.log('the result is', result)
})
function worker (arg, cb) {
cb(null, arg * 2)
}
Usage (promise API)
'use strict'
const queue = require('fastq').promise(worker, 1)
async function worker (arg) {
return arg * 2
}
async function run () {
const result = await queue.push(42)
console.log('the result is', result)
}
run()
Setting "this"
'use strict'
const that = { hello: 'world' }
const queue = require('fastq')(that, worker, 1)
queue.push(42, function (err, result) {
if (err) { throw err }
console.log(this)
console.log('the result is', result)
})
function worker (arg, cb) {
console.log(this)
cb(null, arg * 2)
}
Using with TypeScript (callback API)
'use strict'
import * as fastq from "fastq";
import type { queue, done } from "fastq";
type Task = {
id: number
}
const q: queue<Task> = fastq(worker, 1)
q.push({ id: 42 })
function worker (arg: Task, cb: done) {
console.log(arg.id)
cb(null)
}
Using with TypeScript (promise API)
'use strict'
import * as fastq from "fastq";
import type { queueAsPromised } from "fastq";
type Task = {
id: number
}
const q: queueAsPromised<Task> = fastq.promise(asyncWorker, 1)
q.push({ id: 42 }).catch((err) => console.error(err))
async function asyncWorker (arg: Task): Promise<void> {
// No need for a try-catch block, fastq handles errors automatically
console.log(arg.id)
}
API
- fastqueue()
- queue#push()
- queue#unshift()
- queue#pause()
- queue#resume()
- queue#idle()
- queue#length()
- queue#getQueue()
- queue#kill()
- queue#killAndDrain()
- queue#error()
- queue#concurrency
- queue#paused
- queue#drain
- queue#empty
- queue#saturated
- fastqueue.promise()
fastqueue([that], worker, concurrency)
Creates a new queue.
Arguments:
- that, optional context of the worker function.
- worker, worker function; it will be called with that as this if that is specified.
- concurrency, number of concurrent tasks that can be executed in parallel.
queue.push(task, done)
Add a task at the end of the queue. done(err, result) will be called
when the task has been processed.
queue.unshift(task, done)
Add a task at the beginning of the queue. done(err, result) will be called
when the task has been processed.
queue.pause()
Pause the processing of tasks. Tasks that are currently being processed are not stopped.
queue.resume()
Resume the processing of tasks.
queue.idle()
Returns false if there are tasks being processed or waiting to be processed, true otherwise.
queue.length()
Returns the number of tasks waiting to be processed (in the queue).
queue.getQueue()
Returns all the tasks waiting to be processed (in the queue). Returns an empty array when there are no tasks.
queue.kill()
Removes all tasks waiting to be processed, and reset drain to an empty
function.
queue.killAndDrain()
Same as kill, but the drain function will be called before it is reset to an empty function.
queue.error(handler)
Set a global error handler. handler(err, task) will be called
each time a task is completed; err will not be null if the task has thrown an error.
queue.concurrency
Property that returns the number of concurrent tasks that could be executed in parallel. It can be altered at runtime.
queue.paused
Property (Read-Only) that returns true when the queue is in a paused state.
queue.drain
Function that will be called when the last item from the queue has been processed by a worker. It can be altered at runtime.
queue.empty
Function that will be called when the last item from the queue has been assigned to a worker. It can be altered at runtime.
queue.saturated
Function that will be called when the queue hits the concurrency limit. It can be altered at runtime.
fastqueue.promise([that], worker(arg), concurrency)
Creates a new queue with Promise APIs. It also offers all the methods
and properties of the object returned by fastqueue, with modified
push and unshift methods.
Node v10+ is required to use the promisified version.
Arguments:
- that, optional context of the worker function.
- worker, worker function; it will be called with that as this if that is specified. It MUST return a Promise.
- concurrency, number of concurrent tasks that can be executed in parallel.
queue.push(task) => Promise
Add a task at the end of the queue. The returned Promise will be fulfilled (rejected)
when the task is completed successfully (unsuccessfully).
This promise can be ignored as it will not lead to an 'unhandledRejection'.
queue.unshift(task) => Promise
Add a task at the beginning of the queue. The returned Promise will be fulfilled (rejected)
when the task is completed successfully (unsuccessfully).
This promise can be ignored as it will not lead to an 'unhandledRejection'.
queue.drained() => Promise
Wait for the queue to be drained. The returned Promise will be resolved when all tasks in the queue have been processed by a worker.
This promise can be ignored as it will not lead to an 'unhandledRejection'.
License
ISC
