ParquetSchemaDefinitionCache
Extends
LRUCache<ParquetSchemaDefinition<any>, ParquetSchema<any>>
Implements
Disposable
Constructors
new ParquetSchemaDefinitionCache()
new ParquetSchemaDefinitionCache(max): ParquetSchemaDefinitionCache
Parameters
Parameter | Type | Default value |
---|---|---|
max | number | 1000 |
Returns
ParquetSchemaDefinitionCache
Overrides
LRUCache<ParquetSchemaDefinition<any>, ParquetSchema<any>>.constructor
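The constructor's only option is max (default 1000), which bounds the number of cached schemas; once full, the least recently used entry is evicted. A minimal Map-based sketch of that eviction behavior (illustrative only; the real lru-cache uses preallocated index arrays, not a Map):

```typescript
// Minimal sketch of bounded-LRU semantics, NOT the lru-cache implementation.
// A Map preserves insertion order, so re-inserting on access keeps the most
// recently used key last; eviction removes the first (oldest) key.
class TinyLRU<K, V> {
  private map = new Map<K, V>();
  constructor(private max: number = 1000) {}

  get(k: K): V | undefined {
    if (!this.map.has(k)) return undefined;
    const v = this.map.get(k)!;
    this.map.delete(k); // refresh recency by moving the key to the end
    this.map.set(k, v);
    return v;
  }

  set(k: K, v: V): void {
    this.map.delete(k);
    this.map.set(k, v);
    if (this.map.size > this.max) {
      // evict the least recently used entry (first key in insertion order)
      const oldest = this.map.keys().next().value as K;
      this.map.delete(oldest);
    }
  }

  get size(): number {
    return this.map.size;
  }
}

const lru = new TinyLRU<string, number>(2);
lru.set("a", 1);
lru.set("b", 2);
lru.get("a"); // touch "a" so "b" becomes least recently used
lru.set("c", 3); // exceeds max=2, evicting "b"
```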
Properties
[toStringTag]
[toStringTag]: string;
A String value that is used in the creation of the default string description of an object. Called by the built-in method Object.prototype.toString.
Inherited from
LRUCache.[toStringTag]
allowStale
allowStale: boolean;
LRUCache.OptionsBase.allowStale
Inherited from
LRUCache.allowStale
allowStaleOnFetchAbort
allowStaleOnFetchAbort: boolean;
LRUCache.OptionsBase.allowStaleOnFetchAbort
Inherited from
LRUCache.allowStaleOnFetchAbort
allowStaleOnFetchRejection
allowStaleOnFetchRejection: boolean;
LRUCache.OptionsBase.allowStaleOnFetchRejection
Inherited from
LRUCache.allowStaleOnFetchRejection
ignoreFetchAbort
ignoreFetchAbort: boolean;
LRUCache.OptionsBase.ignoreFetchAbort
Inherited from
LRUCache.ignoreFetchAbort
maxEntrySize
maxEntrySize: number;
LRUCache.OptionsBase.maxEntrySize
Inherited from
LRUCache.maxEntrySize
noDeleteOnFetchRejection
noDeleteOnFetchRejection: boolean;
LRUCache.OptionsBase.noDeleteOnFetchRejection
Inherited from
LRUCache.noDeleteOnFetchRejection
noDeleteOnStaleGet
noDeleteOnStaleGet: boolean;
LRUCache.OptionsBase.noDeleteOnStaleGet
Inherited from
LRUCache.noDeleteOnStaleGet
noDisposeOnSet
noDisposeOnSet: boolean;
LRUCache.OptionsBase.noDisposeOnSet
Inherited from
LRUCache.noDisposeOnSet
noUpdateTTL
noUpdateTTL: boolean;
LRUCache.OptionsBase.noUpdateTTL
Inherited from
LRUCache.noUpdateTTL
sizeCalculation?
optional sizeCalculation: SizeCalculator<ParquetSchemaDefinition<any>, ParquetSchema<any>>;
LRUCache.OptionsBase.sizeCalculation
Inherited from
LRUCache.sizeCalculation
ttl
ttl: number;
LRUCache.OptionsBase.ttl
Inherited from
LRUCache.ttl
ttlAutopurge
ttlAutopurge: boolean;
LRUCache.OptionsBase.ttlAutopurge
Inherited from
LRUCache.ttlAutopurge
ttlResolution
ttlResolution: number;
LRUCache.OptionsBase.ttlResolution
Inherited from
LRUCache.ttlResolution
updateAgeOnGet
updateAgeOnGet: boolean;
LRUCache.OptionsBase.updateAgeOnGet
Inherited from
LRUCache.updateAgeOnGet
updateAgeOnHas
updateAgeOnHas: boolean;
LRUCache.OptionsBase.updateAgeOnHas
Inherited from
LRUCache.updateAgeOnHas
Accessors
calculatedSize
get calculatedSize(): number
The total computed size of items in the cache (read-only)
Returns
number
Inherited from
LRUCache.calculatedSize
dispose
get dispose(): undefined | Disposer<K, V>
LRUCache.OptionsBase.dispose (read-only)
Returns
undefined | Disposer<K, V>
Inherited from
LRUCache.dispose
disposeAfter
get disposeAfter(): undefined | Disposer<K, V>
LRUCache.OptionsBase.disposeAfter (read-only)
Returns
undefined | Disposer<K, V>
Inherited from
LRUCache.disposeAfter
fetchMethod
get fetchMethod(): undefined | Fetcher<K, V, FC>
LRUCache.OptionsBase.fetchMethod (read-only)
Returns
undefined | Fetcher<K, V, FC>
Inherited from
LRUCache.fetchMethod
max
get max(): number
LRUCache.OptionsBase.max (read-only)
Returns
number
Inherited from
LRUCache.max
maxSize
get maxSize(): number
LRUCache.OptionsBase.maxSize (read-only)
Returns
number
Inherited from
LRUCache.maxSize
memoMethod
get memoMethod(): undefined | Memoizer<K, V, FC>
Returns
undefined | Memoizer<K, V, FC>
Inherited from
LRUCache.memoMethod
size
get size(): number
The number of items stored in the cache (read-only)
Returns
number
Inherited from
LRUCache.size
Methods
[dispose]()
[dispose](): Promise<void>
Returns
Promise<void>
Implementation of
Disposable.[dispose]
[iterator]()
[iterator](): Generator<[ParquetSchemaDefinition<any>, ParquetSchema<any>], void, unknown>
Iterating over the cache itself yields the same results as LRUCache.entries.
Returns
Generator<[ParquetSchemaDefinition<any>, ParquetSchema<any>], void, unknown>
Inherited from
LRUCache.[iterator]
clear()
clear(): void
Clear the cache entirely, throwing away all values.
Returns
void
Inherited from
LRUCache.clear
delete()
delete(k): boolean
Deletes a key out of the cache.
Returns true if the key was deleted, false otherwise.
Parameters
Parameter | Type |
---|---|
k | ParquetSchemaDefinition<any> |
Returns
boolean
Inherited from
LRUCache.delete
dump()
dump(): [ParquetSchemaDefinition<any>, Entry<ParquetSchema<any>>][]
Return an array of [key, LRUCache.Entry] tuples which can be passed to LRUCache#load.
The start fields are calculated relative to a portable Date.now() timestamp, even if performance.now() is available.
Stale entries are always included in the dump, even if LRUCache.OptionsBase.allowStale is false.
Note: this returns an actual array, not a generator, so it can be more easily passed around.
Returns
[ParquetSchemaDefinition<any>, Entry<ParquetSchema<any>>][]
Inherited from
LRUCache.dump
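Because dump() returns a plain array rather than a generator, its output can be serialized and later fed to load(). A sketch of that round-trip with a simplified Entry shape (the real Entry may also carry ttl, size, and start fields):

```typescript
// Sketch of the dump()/load() round-trip: dump() yields [key, entry]
// tuples precisely so they survive JSON serialization and can be fed
// into load() on another cache instance. Entry is simplified here.
type Entry<V> = { value: V };
type Dumped<K, V> = [K, Entry<V>][];

const original = new Map<string, Entry<string>>([
  ["schemaA", { value: "compiled-A" }],
  ["schemaB", { value: "compiled-B" }],
]);

// dump(): an actual array, not a generator, so JSON.stringify works directly
const dumped: Dumped<string, string> = [...original.entries()];
const wire = JSON.stringify(dumped);

// load(): reset the cache and re-insert entries in the order listed
const restored = new Map<string, Entry<string>>(JSON.parse(wire));
```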
entries()
entries(): Generator<[ParquetSchemaDefinition<any>, ParquetSchema<any>], void, unknown>
Return a generator yielding [key, value] pairs, in order from most recently used to least recently used.
Returns
Generator<[ParquetSchemaDefinition<any>, ParquetSchema<any>], void, unknown>
Inherited from
LRUCache.entries
fetch()
fetch(k, fetchOptions)
fetch(k, fetchOptions): Promise<undefined | ParquetSchema<any>>
Make an asynchronous cached fetch using the LRUCache.OptionsBase.fetchMethod function.
If the value is in the cache and not stale, then the returned Promise resolves to the value.
If not in the cache, or beyond its TTL staleness, then fetchMethod(key, staleValue, { options, signal, context }) is called, and the value returned will be added to the cache once resolved.
If called with allowStale, and an asynchronous fetch is currently in progress to reload a stale value, then the former stale value will be returned.
If called with forceRefresh, then the cached item will be re-fetched, even if it is not stale. However, if allowStale is also set, then the old value will still be returned. This is useful in cases where you want to force a reload of a cached value. If a background fetch is already in progress, then forceRefresh has no effect.
If multiple fetches for the same key are issued, then they will all be coalesced into a single call to fetchMethod.
Note that this means that handling options such as LRUCache.OptionsBase.allowStaleOnFetchAbort, LRUCache.FetchOptions.signal, and LRUCache.OptionsBase.allowStaleOnFetchRejection will be determined by the FIRST fetch() call for a given key.
This is a known (fixable) shortcoming which will be addressed when someone complains about it, as the fix would involve added complexity and may not be worth the cost for this edge case.
If LRUCache.OptionsBase.fetchMethod is not specified, then this is effectively an alias for Promise.resolve(cache.get(key)).
When the fetch method resolves to a value, if the fetch has not been aborted due to deletion, eviction, or being overwritten, then it is added to the cache using the options provided.
If the key is evicted or deleted before the fetchMethod resolves, then the AbortSignal passed to the fetchMethod will receive an abort event, and the promise returned by fetch() will reject with the reason for the abort.
If a signal is passed to the fetch() call, then aborting the signal will abort the fetch and cause the fetch() promise to reject with the reason provided.
Setting context
If an FC type is set to a type other than unknown, void, or undefined in the LRUCache constructor, then all calls to cache.fetch() must provide a context option. If set to undefined or void, then calls to fetch must not provide a context option.
The context param allows you to provide arbitrary data that might be relevant in the course of fetching the data. It is only relevant for the course of a single fetch() operation, and discarded afterwards.
Note: fetch() calls are inflight-unique
If you call fetch() multiple times with the same key value, then every call after the first will resolve on the same promise¹, even if they have different settings that would otherwise change the behavior of the fetch, such as noDeleteOnFetchRejection or ignoreFetchAbort.
In most cases, this is not a problem (in fact, only fetching something once is what you probably want, if you're caching in the first place). If you are changing the fetch() options dramatically between runs, there's a good chance that you might be trying to fit divergent semantics into a single object, and would be better off with multiple cache instances.
¹ That is, they're not the "same Promise", but they resolve at the same time, because they're both waiting on the same underlying fetchMethod response.
Parameters
Parameter | Type |
---|---|
k | ParquetSchemaDefinition<any> |
fetchOptions | FetchOptions<ParquetSchemaDefinition<any>, ParquetSchema<any>, unknown> |
Returns
Promise<undefined | ParquetSchema<any>>
Inherited from
LRUCache.fetch
fetch(k, fetchOptions)
fetch(k, fetchOptions?): Promise<undefined | ParquetSchema<any>>
Parameters
Parameter | Type |
---|---|
k | ParquetSchemaDefinition<any> |
fetchOptions? | FetchOptions<ParquetSchemaDefinition<any>, ParquetSchema<any>, unknown> |
Returns
Promise<undefined | ParquetSchema<any>>
Inherited from
LRUCache.fetch
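The "inflight-unique" coalescing described above can be sketched with a Map of pending promises. This is a simplified illustration, not the library's implementation (it omits TTL, staleness, and abort handling):

```typescript
// Sketch of fetch() coalescing: concurrent fetches for the same key
// share one underlying fetchMethod call via an in-flight promise map.
type Fetcher<K, V> = (key: K) => Promise<V>;

class CoalescingFetch<K, V> {
  private cache = new Map<K, V>();
  private inflight = new Map<K, Promise<V>>();
  constructor(private fetchMethod: Fetcher<K, V>) {}

  fetch(k: K): Promise<V> {
    // cache hit: resolve immediately with the cached value
    if (this.cache.has(k)) return Promise.resolve(this.cache.get(k)!);
    let p = this.inflight.get(k);
    if (!p) {
      // first caller triggers the fetch; later callers reuse this promise
      p = this.fetchMethod(k).then((v) => {
        this.cache.set(k, v);
        this.inflight.delete(k);
        return v;
      });
      this.inflight.set(k, p);
    }
    return p;
  }
}

let calls = 0;
const f = new CoalescingFetch<string, number>(async (k) => {
  calls++;
  return k.length;
});

// three concurrent fetches for the same key coalesce into one call
const results = await Promise.all([f.fetch("abc"), f.fetch("abc"), f.fetch("abc")]);
```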
find()
find(fn, getOptions?): undefined | ParquetSchema<any>
Find a value for which the supplied fn method returns a truthy value, similar to Array.find(). fn is called as fn(value, key, cache).
Parameters
Parameter | Type |
---|---|
fn | (v, k, self) => boolean |
getOptions? | GetOptions<ParquetSchemaDefinition<any>, ParquetSchema<any>, unknown> |
Returns
undefined | ParquetSchema<any>
Inherited from
LRUCache.find
findOrCreateSchema()
findOrCreateSchema<T>(schemaDef): ParquetSchema<T>
Type Parameters
Type Parameter |
---|
T extends ParquetRecordLike |
Parameters
Parameter | Type |
---|---|
schemaDef | ParquetSchemaDefinition<T> |
Returns
ParquetSchema<T>
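The method name suggests a get-or-create pattern: return the cached compiled schema for a definition, compiling and caching it on a miss. A hypothetical sketch with stand-in types (SchemaDef and CompiledSchema are illustrative only, not the real ParquetSchemaDefinition/ParquetSchema; keys here compare by object identity, as Map keys do):

```typescript
// Hypothetical sketch of a get-or-create schema cache. The real
// findOrCreateSchema presumably compiles a ParquetSchemaDefinition into a
// ParquetSchema once and reuses it; these types are stand-ins.
type SchemaDef = Record<string, { type: string }>;
type CompiledSchema = { fields: string[] };

let compilations = 0;
const schemaCache = new Map<SchemaDef, CompiledSchema>();

function findOrCreateSchema(def: SchemaDef): CompiledSchema {
  let schema = schemaCache.get(def);
  if (!schema) {
    compilations++; // the expensive compile step happens only on a miss
    schema = { fields: Object.keys(def) };
    schemaCache.set(def, schema);
  }
  return schema;
}

const def: SchemaDef = { id: { type: "INT64" }, name: { type: "UTF8" } };
const first = findOrCreateSchema(def);
const second = findOrCreateSchema(def); // cache hit: same compiled object
```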
forceFetch()
forceFetch(k, fetchOptions)
forceFetch(k, fetchOptions): Promise<ParquetSchema<any>>
In some cases, cache.fetch() may resolve to undefined, either because a LRUCache.OptionsBase#fetchMethod was not provided (turning cache.fetch(k) into just an async wrapper around cache.get(k)) or because ignoreFetchAbort was specified (either to the constructor or in the LRUCache.FetchOptions). Also, the OptionsBase.fetchMethod may return undefined or void, making the test even more complicated.
Because inferring the cases where undefined might be returned is so cumbersome, but testing for undefined can also be annoying, this method can be used, which will reject if this.fetch() resolves to undefined.
Parameters
Parameter | Type |
---|---|
k | ParquetSchemaDefinition<any> |
fetchOptions | FetchOptions<ParquetSchemaDefinition<any>, ParquetSchema<any>, unknown> |
Returns
Promise<ParquetSchema<any>>
Inherited from
LRUCache.forceFetch
forceFetch(k, fetchOptions)
forceFetch(k, fetchOptions?): Promise<ParquetSchema<any>>
Parameters
Parameter | Type |
---|---|
k | ParquetSchemaDefinition<any> |
fetchOptions? | FetchOptions<ParquetSchemaDefinition<any>, ParquetSchema<any>, unknown> |
Returns
Promise<ParquetSchema<any>>
Inherited from
LRUCache.forceFetch
forEach()
forEach(fn, thisp?): void
Call the supplied function on each item in the cache, in order from most recently used to least recently used.
fn is called as fn(value, key, cache).
If thisp is provided, the function will be called in the this-context of the provided object, or the cache if no thisp object is provided.
Does not update age or recency of use, or iterate over stale values.
Parameters
Parameter | Type |
---|---|
fn | (v, k, self) => any |
thisp? | any |
Returns
void
Inherited from
LRUCache.forEach
get()
get(k, getOptions?): undefined | ParquetSchema<any>
Return a value from the cache. Will update the recency of the cache entry found.
If the key is not found, get() will return undefined.
Parameters
Parameter | Type |
---|---|
k | ParquetSchemaDefinition<any> |
getOptions? | GetOptions<ParquetSchemaDefinition<any>, ParquetSchema<any>, unknown> |
Returns
undefined | ParquetSchema<any>
Inherited from
LRUCache.get
getRemainingTTL()
getRemainingTTL(key): number
Return the number of ms left in the item's TTL. If the item is not in the cache, returns 0. Returns Infinity if the item is in the cache without a defined TTL.
Parameters
Parameter | Type |
---|---|
key | ParquetSchemaDefinition<any> |
Returns
number
Inherited from
LRUCache.getRemainingTTL
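The three cases above (absent key, entry without a TTL, remaining milliseconds) can be sketched as follows, using a stand-in store rather than the real implementation:

```typescript
// Sketch of getRemainingTTL() semantics: 0 when the key is absent,
// Infinity when present without a TTL, otherwise milliseconds remaining
// (which would be negative once expired). Stand-in store, not the real code.
type TtlEntry = { expiresAt?: number };
const store = new Map<string, TtlEntry>();
const now = () => Date.now();

function getRemainingTTL(key: string): number {
  const e = store.get(key);
  if (!e) return 0; // not in cache
  if (e.expiresAt === undefined) return Infinity; // no TTL defined
  return e.expiresAt - now(); // ms left; negative if already expired
}

store.set("noTtl", {});
store.set("soon", { expiresAt: now() + 5000 });
```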
has()
has(k, hasOptions?): boolean
Check if a key is in the cache, without updating the recency of use. Age is updated if LRUCache.OptionsBase.updateAgeOnHas is set to true in either the options or the constructor.
Will return false if the item is stale, even though it is technically in the cache. The difference can be determined (if it matters) by using a status argument, and inspecting the has field.
Parameters
Parameter | Type |
---|---|
k | ParquetSchemaDefinition<any> |
hasOptions? | HasOptions<ParquetSchemaDefinition<any>, ParquetSchema<any>, unknown> |
Returns
boolean
Inherited from
LRUCache.has
info()
info(key): undefined | Entry<ParquetSchema<any>>
Get the extended info about a given entry, to get its value, size, and TTL info simultaneously. Returns undefined if the key is not present.
Unlike LRUCache#dump, which is designed to be portable and survive serialization, the start value is always the current timestamp, and the ttl is a calculated remaining time to live (negative if expired).
Always returns stale values, if their info is found in the cache, so be sure to check for expirations (ie, a negative LRUCache.Entry#ttl) if relevant.
Parameters
Parameter | Type |
---|---|
key | ParquetSchemaDefinition<any> |
Returns
undefined | Entry<ParquetSchema<any>>
Inherited from
LRUCache.info
keys()
keys(): Generator<ParquetSchemaDefinition<any>, void, unknown>
Return a generator yielding the keys in the cache, in order from most recently used to least recently used.
Returns
Generator<ParquetSchemaDefinition<any>, void, unknown>
Inherited from
LRUCache.keys
load()
load(arr): void
Reset the cache and load in the items in entries in the order listed.
The shape of the resulting cache may be different if the same options are not used in both caches.
The start fields are assumed to be calculated relative to a portable Date.now() timestamp, even if performance.now() is available.
Parameters
Parameter | Type |
---|---|
arr | [ParquetSchemaDefinition<any>, Entry<ParquetSchema<any>>][] |
Returns
void
Inherited from
LRUCache.load
memo()
memo(k, memoOptions)
memo(k, memoOptions): ParquetSchema<any>
If the key is found in the cache, then this is equivalent to LRUCache#get. If not in the cache, then calculate the value using the LRUCache.OptionsBase.memoMethod, and add it to the cache.
If an FC type is set to a type other than unknown, void, or undefined in the LRUCache constructor, then all calls to cache.memo() must provide a context option. If set to undefined or void, then calls to memo must not provide a context option.
The context param allows you to provide arbitrary data that might be relevant in the course of fetching the data. It is only relevant for the course of a single memo() operation, and discarded afterwards.
Parameters
Parameter | Type |
---|---|
k | ParquetSchemaDefinition<any> |
memoOptions | MemoOptions<ParquetSchemaDefinition<any>, ParquetSchema<any>, unknown> |
Returns
ParquetSchema<any>
Inherited from
LRUCache.memo
memo(k, memoOptions)
memo(k, memoOptions?): ParquetSchema<any>
Parameters
Parameter | Type |
---|---|
k | ParquetSchemaDefinition<any> |
memoOptions? | MemoOptions<ParquetSchemaDefinition<any>, ParquetSchema<any>, unknown> |
Returns
ParquetSchema<any>
Inherited from
LRUCache.memo
peek()
peek(k, peekOptions?): undefined | ParquetSchema<any>
Like LRUCache#get but doesn't update recency or delete stale items.
Returns undefined if the item is stale, unless LRUCache.OptionsBase.allowStale is set.
Parameters
Parameter | Type |
---|---|
k | ParquetSchemaDefinition<any> |
peekOptions? | PeekOptions<ParquetSchemaDefinition<any>, ParquetSchema<any>, unknown> |
Returns
undefined | ParquetSchema<any>
Inherited from
LRUCache.peek
pop()
pop(): undefined | ParquetSchema<any>
Evict the least recently used item, returning its value or undefined if the cache is empty.
Returns
undefined | ParquetSchema<any>
Inherited from
LRUCache.pop
purgeStale()
purgeStale(): boolean
Delete any stale entries. Returns true if anything was removed, false otherwise.
Returns
boolean
Inherited from
LRUCache.purgeStale
rentries()
rentries(): Generator<(undefined | ParquetSchemaDefinition<any> | ParquetSchema<any> | BackgroundFetch<ParquetSchema<any>>)[], void, unknown>
Inverse order version of LRUCache.entries
Return a generator yielding [key, value] pairs, in order from least recently used to most recently used.
Returns
Generator<(undefined | ParquetSchemaDefinition<any> | ParquetSchema<any> | BackgroundFetch<ParquetSchema<any>>)[], void, unknown>
Inherited from
LRUCache.rentries
rforEach()
rforEach(fn, thisp?): void
The same as LRUCache.forEach but items are iterated over in reverse order. (ie, less recently used items are iterated over first.)
Parameters
Parameter | Type |
---|---|
fn | (v, k, self) => any |
thisp? | any |
Returns
void
Inherited from
LRUCache.rforEach
rkeys()
rkeys(): Generator<ParquetSchemaDefinition<any>, void, unknown>
Inverse order version of LRUCache.keys
Return a generator yielding the keys in the cache, in order from least recently used to most recently used.
Returns
Generator<ParquetSchemaDefinition<any>, void, unknown>
Inherited from
LRUCache.rkeys
rvalues()
rvalues(): Generator<undefined | ParquetSchema<any> | BackgroundFetch<ParquetSchema<any>>, void, unknown>
Inverse order version of LRUCache.values
Return a generator yielding the values in the cache, in order from least recently used to most recently used.
Returns
Generator<undefined | ParquetSchema<any> | BackgroundFetch<ParquetSchema<any>>, void, unknown>
Inherited from
LRUCache.rvalues
set()
set(k, v, setOptions?): this
Add a value to the cache.
Note: if undefined is specified as a value, this is an alias for LRUCache#delete. undefined is never stored in the cache.
Fields on the LRUCache.SetOptions options param will override their corresponding values in the constructor options for the scope of this single set() operation.
If start is provided, then that will set the effective start time for the TTL calculation. Note that this must be a previous value of performance.now() if supported, or a previous value of Date.now() if not.
Options object may also include size, which will prevent calling the sizeCalculation function and just use the specified number if it is a positive integer, and noDisposeOnSet, which will prevent calling a dispose function in the case of overwrites.
If the size (or return value of sizeCalculation) for a given entry is greater than maxEntrySize, then the item will not be added to the cache.
Will update the recency of the entry.
Parameters
Parameter | Type |
---|---|
k | ParquetSchemaDefinition<any> |
v | undefined | ParquetSchema<any> | BackgroundFetch<ParquetSchema<any>> |
setOptions? | SetOptions<ParquetSchemaDefinition<any>, ParquetSchema<any>, unknown> |
Returns
this
Inherited from
LRUCache.set
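Two behaviors worth noting in code: set() returns this, so calls chain, and setting undefined deletes the key. A minimal stand-in sketch (a Map wrapper, not the real implementation):

```typescript
// Sketch of two set() behaviors: chaining via the `this` return value,
// and `undefined` acting as a delete. Stand-in Map wrapper only.
class MiniCache<K, V> {
  private map = new Map<K, V>();

  set(k: K, v: V | undefined): this {
    if (v === undefined) this.map.delete(k); // undefined is never stored
    else this.map.set(k, v);
    return this; // enables chained calls
  }

  get(k: K): V | undefined {
    return this.map.get(k);
  }

  get size(): number {
    return this.map.size;
  }
}

const c = new MiniCache<string, number>();
c.set("a", 1).set("b", 2).set("a", undefined); // chained; last call deletes "a"
```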
values()
values(): Generator<ParquetSchema<any>, void, unknown>
Return a generator yielding the values in the cache, in order from most recently used to least recently used.
Returns
Generator<ParquetSchema<any>, void, unknown>
Inherited from
LRUCache.values
unsafeExposeInternals()
static unsafeExposeInternals<K, V, FC>(c): object
Do not call this method unless you need to inspect the inner workings of the cache. If anything returned by this object is modified in any way, strange breakage may occur.
These fields are private for a reason!
Type Parameters
Type Parameter | Default type |
---|---|
K extends object | - |
V extends object | - |
FC extends unknown | unknown |
Parameters
Parameter | Type |
---|---|
c | LRUCache<K, V, FC> |
Returns
object
backgroundFetch()
backgroundFetch: (k, index, options, context) => BackgroundFetch<V>;
Parameters
Parameter | Type |
---|---|
k | K |
index | undefined | number |
options | FetchOptions<K, V, FC> |
context | any |
Returns
BackgroundFetch<V>
free
free: StackLike;
head
readonly head: Index;
indexes()
indexes: (options?) => Generator<Index, void, unknown>;
Parameters
Parameter | Type |
---|---|
options ? | object |
options.allowStale ? | boolean |
Returns
Generator<Index, void, unknown>
isBackgroundFetch()
isBackgroundFetch: (p) => boolean;
Parameters
Parameter | Type |
---|---|
p | any |
Returns
boolean
isStale()
isStale: (index) => boolean;
Parameters
Parameter | Type |
---|---|
index | undefined | number |
Returns
boolean
keyList
keyList: (undefined | K)[];
keyMap
keyMap: Map<K, number>;
moveToTail()
moveToTail: (index) => void;
Parameters
Parameter | Type |
---|---|
index | number |
Returns
void
next
next: NumberArray;
prev
prev: NumberArray;
rindexes()
rindexes: (options?) => Generator<Index, void, unknown>;
Parameters
Parameter | Type |
---|---|
options ? | object |
options.allowStale ? | boolean |
Returns
Generator<Index, void, unknown>
sizes
sizes: undefined | ZeroArray;
starts
starts: undefined | ZeroArray;
tail
readonly tail: Index;
ttls
ttls: undefined | ZeroArray;
valList
valList: (undefined | V | BackgroundFetch<V>)[];
Inherited from
LRUCache.unsafeExposeInternals