Module tf.cheatsheet

A. Advanced API

Initialisation, configuration, meta data, and linking

A = use('org/repo')

start up and load a corpus from a repository and deliver its API.

See tf.about.usefunc

A.load(features)

Load an extra bunch of features.

App.load()

A.showContext(...)

show app settings

showContext()

A.header()

show colophon

header()

A.showProvenance(...)

show provenance of code and data

showProvenance()

A.webLink(n, ...)

hyperlink to node n on the web

webLink()

A.isLoaded(features=None)

Show information about loaded features

Api.isLoaded()


Displaying

A.specialCharacters()

show all hard-to-type characters in the corpus in a widget

specialCharacters()

A.showFormats()

show all text formats and their definitions

showFormats()

A.dm(markdownString)

display markdown string in notebook

dm()

A.dh(htmlString)

display HTML string in notebook

dh()

A.method(option1=value1, option2=value2, ...)

Many of the following methods accept these options as keyword arguments:

tf.advanced.options

A.displayShow(...)

show display options

displayShow()

A.displayReset(...)

reset display options

displayReset()

A.displaySetup(...)

set up display options

displaySetup()

A.table(results, ...)

plain rendering of a tuple of tuples of nodes

table()

A.plainTuple(tup, ith, ...)

plain rendering of a tuple of nodes

plainTuple()

A.plain(node, ...)

plain rendering of a node

plain()

A.show(results, ...)

pretty rendering of a tuple of tuples of nodes

show()

A.prettyTuple(tup, ith, ...)

pretty rendering of a tuple of nodes

prettyTuple()

A.pretty(node, ...)

pretty rendering of a node

pretty()

A.unravel(node, ...)

convert a graph to a tree

unravel()

A.getCss()

get the complete CSS stylesheet for this app

getCss()


Search (high level)

A.search(...)

search, collect and deliver results, report number of results

search()


Sections and Structure

A.nodeFromSectionStr(...)

look up node for section heading

nodeFromSectionStr()

A.sectionStrFromNode(...)

look up section heading for node

sectionStrFromNode()

A.structureStrFromNode(...)

look up structure heading for node

structureStrFromNode()


Volumes and collections

See also tf.about.volumes.

A.extract(volumes, ...)

export volumes based on a volume specification

Fabric.extract()

A.collect(volumes, ...)

collect several volumes into a new collection

Fabric.collect()


Export to Excel

A.export(results, ...)

export formatted data

export()


Logging

A.dm(markdownString)

display markdown string in notebook

dm()

A.dh(htmlString)

display HTML string in notebook

dh()

A.version

version number of data of the corpus.

Fabric.version

The following methods also work on TF. instead of A.:

A.banner

banner of the Text-Fabric program.

Fabric.banner

A.isSilent()

report the verbosity of Text-Fabric

Timestamp.isSilent()

A.silentOn(deep=False)

make TF silent from now on.

Timestamp.silentOn()

A.silentOff()

make TF talkative from now on.

Timestamp.silentOff()

A.setSilent(silent)

set the verbosity of Text-Fabric.

Timestamp.setSilent()

A.indent(level=None, reset=False)

Sets up indentation and timing of following messages

Timestamp.indent()

A.info(msg, tm=True, nl=True, ...)

informational message

Timestamp.info()

A.warning(msg, tm=True, nl=True, ...)

warning message

Timestamp.warning()

A.error(msg, tm=True, nl=True, ...)

error message

Timestamp.error()


N. F. E. L. T. S. C. Core API

N. Nodes

Read about the canonical ordering here: tf.core.nodes.

N.walk()

generator of all nodes in canonical ordering

Nodes.walk()

N.sortNodes(nodes)

sorts nodes in the canonical ordering

Nodes.sortNodes()

N.otypeRank[nodeType]

ranking position of nodeType

Nodes.otypeRank

N.sortKey(node)

defines the canonical ordering on nodes

Nodes.sortKey

N.sortKeyTuple(tup)

extends the canonical ordering on nodes to tuples of nodes

Nodes.sortKeyTuple

N.sortKeyChunk(node)

defines the canonical ordering on node chunks

Nodes.sortKeyChunk
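The canonical ordering can be illustrated with a toy sketch (not TF's actual implementation): a slot node is keyed on itself; a non-slot node is keyed on its first slot ascending, then its last slot descending, so an embedding node precedes the nodes it embeds. The node numbers and oslots mapping below are made up for illustration.

```python
# Toy illustration of TF's canonical ordering (not TF's real code).
# Slots are nodes 1..5; non-slot nodes map to the slots they occupy.
maxSlot = 5
oslots = {
    10: {1, 2, 3, 4, 5},   # a "sentence" spanning all slots
    11: {1, 2},            # a "phrase"
    12: {3, 4, 5},         # another "phrase"
}

def sortKey(n):
    # A slot is its own key; a non-slot node is keyed on
    # (first slot, -last slot): earlier start first, wider node first.
    if n <= maxSlot:
        return (n, -n)
    slots = oslots[n]
    return (min(slots), -max(slots))

nodes = [12, 3, 10, 1, 11, 4, 2, 5]
print(sorted(nodes, key=sortKey))
# -> [10, 11, 1, 2, 12, 3, 4, 5]: the sentence precedes the phrase
#    that starts at the same slot, which precedes its own slots.
```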


F. Node features

Fall()

all loaded feature names (node features only)

Api.Fall()

F.fff.v(node)

get value of node feature fff

NodeFeature.v()

F.fff.s(value)

get nodes where feature fff has a given value

NodeFeature.s()

F.fff.freqList(...)

frequency list of values of fff

NodeFeature.freqList()

F.fff.items(...)

generator of all entries of fff as mapping from nodes to values

NodeFeature.items()

F.fff.meta

meta data of feature fff

NodeFeature.meta

Fs('fff')

identical to F.fff, usable if name of feature is variable

Api.Fs()
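The point of Fs (and likewise Es and Cs) is accessing a feature whose name is only known at run time; it amounts to dynamic attribute lookup. A stand-in sketch with made-up classes and a hypothetical lemma feature:

```python
# Illustrative stand-in for F.fff.v(n) versus Fs('fff').v(n);
# these classes are not TF's, they only mimic the access pattern.
class NodeFeature:
    def __init__(self, data):
        self.data = data
    def v(self, n):
        return self.data.get(n)

class FeatureApi:
    def __init__(self, **features):
        for name, data in features.items():
            setattr(self, name, NodeFeature(data))

F = FeatureApi(lemma={1: "amo", 2: "video"})

def Fs(name):
    # Dynamic counterpart of attribute access on F
    return getattr(F, name)

featureName = "lemma"          # e.g. read from config or user input
print(F.lemma.v(1))            # static access
print(Fs(featureName).v(1))    # dynamic access: same result
```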


Special node feature otype

Maps nodes to their types.

F.otype.v(node)

get type of node

OtypeFeature.v()

F.otype.s(nodeType)

get all nodes of type nodeType

OtypeFeature.s()

F.otype.sInterval(nodeType)

give start and end nodes of nodeType

OtypeFeature.sInterval()

F.otype.items(...)

generator of all (node, type) pairs.

OtypeFeature.items()

F.otype.meta

meta data of feature otype

OtypeFeature.meta

F.otype.maxSlot

the last slot node

OtypeFeature.maxSlot

F.otype.maxNode

the last node

OtypeFeature.maxNode

F.otype.slotType

the slot type

OtypeFeature.slotType

F.otype.all

sorted list of all node types

OtypeFeature.all
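How otype behaves can be sketched with a toy model (illustrative only; the node numbers and type names are made up): slots all share the slot type, and higher nodes carry their own type.

```python
# Toy model of the otype feature, not TF internals.
maxSlot = 4                      # nodes 1..4 are slots, say "word"s
otype = {5: "phrase", 6: "phrase", 7: "sentence"}
maxNode = 7

def v(n):
    # cf. F.otype.v(node): type of a node
    return "word" if n <= maxSlot else otype[n]

def s(nodeType):
    # cf. F.otype.s(nodeType): all nodes of that type, in node order
    return tuple(n for n in range(1, maxNode + 1) if v(n) == nodeType)

print(v(2))           # word
print(s("phrase"))    # (5, 6)
```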


E. Edge features

Eall()

all loaded feature names (edge features only)

Api.Eall()

E.fff.f(node)

get value of feature fff for edges from node

EdgeFeature.f()

E.fff.t(node)

get value of feature fff for edges to node

EdgeFeature.t()

E.fff.freqList(...)

frequency list of values of fff

EdgeFeature.freqList()

E.fff.items(...)

generator of all entries of fff as mapping from edges to values

EdgeFeature.items()

E.fff.b(node)

get value of feature fff for edges from and to node

EdgeFeature.b()

E.fff.meta

all meta data of feature fff

EdgeFeature.meta

Es('fff')

identical to E.fff, usable if name of feature is variable

Api.Es()
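The f/t/b accessors on an edge feature can be pictured with a toy labelled-edge mapping (illustrative; not TF's data layout — feature values here are invented):

```python
# Toy model of an edge feature with values.
# edges maps (from, to) -> value, like a labelled edge feature fff.
edges = {(1, 2): "subj", (3, 2): "obj", (2, 4): None}

def f(n):
    # Edges *from* n, as (to, value) pairs — cf. E.fff.f(n)
    return tuple((to, val) for (fr, to), val in edges.items() if fr == n)

def t(n):
    # Edges *to* n, as (from, value) pairs — cf. E.fff.t(n)
    return tuple((fr, val) for (fr, to), val in edges.items() if to == n)

def b(n):
    # Edges from and to n combined — cf. E.fff.b(n)
    return f(n) + t(n)

print(f(2))   # ((4, None),)
print(t(2))   # ((1, 'subj'), (3, 'obj'))
```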


Special edge feature oslots

Maps nodes to the set of slots they occupy.

E.oslots.items(...)

generator of all entries of oslots as mapping from nodes to sets of slots

OslotsFeature.items()

E.oslots.s(node)

set of slots linked to node

OslotsFeature.s()

E.oslots.meta

all meta data of feature oslots

OslotsFeature.meta
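The shape of oslots is simply a mapping from non-slot nodes to the slots they occupy; a toy sketch (node numbers invented, not TF's storage format):

```python
# Illustrative model of oslots, not TF's actual data structure.
oslots = {
    10: frozenset({1, 2, 3}),
    11: frozenset({4, 5}),
}

def slotsOf(n):
    # cf. E.oslots.s(node): the slots linked to a node, in slot order
    return tuple(sorted(oslots[n]))

def items():
    # cf. E.oslots.items(): (node, slots) pairs
    yield from ((n, slotsOf(n)) for n in sorted(oslots))

print(slotsOf(10))     # (1, 2, 3)
print(list(items()))
```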


L. Locality

L.i(node, otype=...)

go to intersecting nodes

Locality.i()

L.u(node, otype=...)

go one level up

Locality.u()

L.d(node, otype=...)

go one level down

Locality.d()

L.p(node, otype=...)

go to adjacent previous nodes

Locality.p()

L.n(node, otype=...)

go to adjacent next nodes

Locality.n()
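Going up and down follows from slot containment: a node is "above" another when its slots properly contain the other's slots. A toy derivation from an invented oslots mapping (TF precomputes this far more efficiently, and its result ordering may differ):

```python
# Sketch of how L.u and L.d follow from slot containment (toy data).
maxSlot = 5
oslots = {
    10: {1, 2, 3, 4, 5},   # sentence
    11: {1, 2},            # phrase
    12: {3, 4, 5},         # phrase
}

def slots(n):
    return {n} if n <= maxSlot else oslots[n]

def up(n):
    # Nodes whose slots properly contain n's slots — cf. L.u(n)
    return tuple(m for m in oslots if oslots[m] > slots(n))

def down(n):
    # Nodes whose slots are properly contained in n's — cf. L.d(n)
    mine = slots(n)
    cands = list(oslots) + list(range(1, maxSlot + 1))
    return tuple(m for m in cands if slots(m) < mine)

print(up(2))      # (10, 11)
print(down(11))   # (1, 2)
```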


T. Text

T.text(node, fmt=..., ...)

give formatted text associated with node

Text.text()


Sections

Rigid sectioning system with 1, 2, or 3 levels

T.sectionTuple(node)

give tuple of section nodes that contain node

Text.sectionTuple()

T.sectionFromNode(node)

give section heading of node

Text.sectionFromNode()

T.nodeFromSection(section)

give node for section heading

Text.nodeFromSection()


Structure

Flexible multilevel sectioning system

T.headingFromNode(node)

give structure heading of node

Text.headingFromNode()

T.nodeFromHeading(heading)

give node for structure heading

Text.nodeFromHeading()

T.structureInfo()

give summary of dataset structure

Text.structureInfo()

T.structure(node)

give structure of node and all in it.

Text.structure()

T.structurePretty(node)

pretty print structure of node and all in it.

Text.structurePretty()

T.top()

give all top-level structural nodes in the dataset

Text.top()

T.up(node)

gives parent of structural node

Text.up()

T.down(node)

gives children of structural node

Text.down()


S. Search (low level)

See tf.about.searchRough

Preparation

S.search(query, limit=None)

Query the TF dataset with a template

Search.search()

S.study(query, ...)

Study the query in order to set up a plan

Search.study()

S.showPlan(details=False)

Show the search plan resulting from the last study.

Search.showPlan()

S.relationsLegend()

Catalog of all relational devices in search templates

Search.relationsLegend()
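A search template is a plain multi-line string in which indentation expresses containment: each deeper-indented line matches a node embedded in the node above it. The sketch below only inspects that nesting without running TF; the node types and the pos feature are hypothetical.

```python
# A TF-style search template: each non-empty line is a node spec;
# deeper indentation means "embedded in the node above".
# Node types and the pos=verb feature here are hypothetical.
query = """
chapter
  sentence
    word pos=verb
"""

# Indentation depth per line reveals the containment nesting.
levels = [
    (len(line) - len(line.lstrip())) // 2
    for line in query.strip("\n").splitlines()
    if line.strip()
]
print(levels)   # [0, 1, 2]: word inside sentence inside chapter
```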


Fetching results

S.count(progress=None, limit=None)

Count the results, up to a limit

Search.count()

S.fetch(limit=None, ...)

Fetches the results, up to a limit

Search.fetch()

S.glean(tup)

Renders a single result into something human readable.

Search.glean()


Implementation

S.tweakPerformance(...)

Set certain parameters that influence the performance of search.

Search.tweakPerformance()


C. Computed data components

Access to precomputed data: Computeds.

All components have just one useful attribute: .data.

Call()

all precomputed data component names

Api.Call()

Cs('ccc')

identical to C.ccc, usable if name of component is variable

Api.Cs()

C.levels.data

various statistics on node types

levels()

C.order.data

the canonical order of the nodes (tf.core.nodes)

order()

C.rank.data

the rank of the nodes in the canonical order (tf.core.nodes)

rank()

C.levUp.data

feeds the Locality.u() function

levUp()

C.levDown.data

feeds the Locality.d() function

levDown()

C.boundary.data

feeds the Locality.p() and Locality.n() functions

boundary()

C.characters.data

frequency list of characters in a corpus, separately for all the text formats

characters()

C.sections.data

feeds the section part of tf.core.text

sections()

C.structure.data

feeds the structure part of tf.core.text

structure()


TF. Dataset

Loading

TF = Fabric(locations=dirs, modules=subdirs, volume=None, collection=None, silent=False)

Initialize the API on a whole work, a single volume, or a collection of volumes, from explicit directories. Prefer use() wherever you can. See also tf.about.volumes.

Fabric

TF.isLoaded(features=None)

Show information about loaded features

Api.isLoaded()

TF.explore(show=True)

Get features by category, loaded or unloaded

FabricCore.explore()

TF.loadAll(silent=None)

Load all loadable features.

FabricCore.loadAll()

TF.load(features, add=False)

Load a bunch of features from scratch or additionally.

FabricCore.load()

TF.ensureLoaded(features)

Make sure that features are loaded.

Api.ensureLoaded()

TF.makeAvailableIn(globals())

Make the members of the core API available in the global scope

Api.makeAvailableIn()

TF.ignored

Which features have been overridden.

Api.ignored


Volumes

See also tf.about.volumes.

TF.extract(volumes, ...)

export volumes based on a volume specification

Fabric.extract()

TF.collect(volumes, ...)

collect several volumes into a new collection

Fabric.collect()

Saving

TF.save(nodeFeatures={}, edgeFeatures={}, metaData={}, ...)

Save a bunch of newly generated features to disk.

FabricCore.save()


House keeping

TF.version

version number of Text-Fabric.

Fabric.version

TF.clearCache()

clears the cache of compiled TF data

FabricCore.clearCache()

from tf.clean import clean
clean()

clears the cache of compiled TF data

tf.clean


Volume support

TF datasets per volume or collection of a work. See also tf.about.volumes.

from tf.volumes import collect

collect(volumes, work, ...)

Collects several volumes into a new collection

tf.volumes.collect

from tf.volumes import extract

extract(work, volumes, ...)

Extracts volumes from a work

tf.volumes.extract


Dataset Operations

from tf.dataset import modify

modify(source, target, ...)

Modifies a TF dataset into a new TF dataset

tf.dataset.modify

from tf.dataset import Versions

Versions(api, va, vb, slotMap)

Extends a slot mapping between versions of a TF dataset to a complete node mapping

tf.dataset.nodemaps


Data Interchange

from tf.lib import readSets
from tf.lib import writeSets
readSets(sourceFile)

reads named sets from a file

readSets()

writeSets(sets, destFile)

writes named sets to a file

writeSets()
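A named-sets file is, in essence, a serialized mapping from names to sets of nodes. The round trip can be sketched with stdlib pickle and gzip; this is an illustrative stand-in, not TF's actual on-disk format, and the set names and node numbers are invented.

```python
import gzip
import os
import pickle
import tempfile

# Illustrative stand-in for readSets/writeSets: serialize a mapping
# from set names to sets of node numbers, then read it back.
sets = {"verbs": {101, 102, 103}, "rare": {999}}

path = os.path.join(tempfile.mkdtemp(), "sets.gz")
with gzip.open(path, "wb") as fh:
    pickle.dump(sets, fh)      # cf. writeSets(sets, destFile)

with gzip.open(path, "rb") as fh:
    sets2 = pickle.load(fh)    # cf. readSets(sourceFile)

print(sets2 == sets)   # True: the named sets survive the round trip
```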


Export to Excel

A.export(results, ...)

export formatted data

export()


Interchange with external annotation tools

from convert.recorder import Recorder
Recorder()

generate annotatable plain text and import annotations

tf.convert.recorder


MQL interchange

TF.exportMQL()

export loaded dataset to MQL

FabricCore.exportMQL()

TF.importMQL()

convert MQL file to TF dataset

FabricCore.importMQL()


Walker conversion

from tf.convert.walker import CV
cv = CV(TF)

convert structured data to TF dataset

tf.convert.walker


Exploding

from tf.convert.tf import explode
explode(inLocation, outLocation)

explode TF feature files to straight data files without optimizations

explode()


TF-App development

A.reuse()

reload config data

App.reuse()

from tf.advanced.find import loadModule
mmm = loadModule("mmm", *args)

load supporting TF-app specific module

loadModule()

~/mypath/myname/app/config.yaml

settings for a TF-App

tf.advanced.settings

Layered search

(these are command-line instructions)

text-fabric-make {dataset} {client} ship

generate a static site with a search interface in client-side JavaScript and publish it to GitHub Pages. If {client} is left out, generate all clients that are defined for this dataset. Clients are defined in the app-{dataset} repo, under layeredsearch.

text-fabric-make {dataset} serve

serve the search interfaces defined for {dataset} locally.

More commands here.
