
Ln-Lz

Last checked or modified: Aug. 23, 2000





LMITOOL
The Linear Matrix Inequality TOOLkit is a package for LMI optimization that acts as an interface to the SP package. An LMI optimization problem is one in which matrix variables are subject to equality and positive-definiteness constraints and the objective is a linear function of these variables.
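
For orientation, the generic textbook form of an LMI problem in a vector variable x is shown below; LMITOOL itself works directly with matrix variables and has its own calling syntax, so this is only a statement of the problem class, not of the tool's interface:

  \min_{x \in \mathbf{R}^m} \; c^T x
  \quad \text{subject to} \quad
  F(x) \;=\; F_0 + x_1 F_1 + \cdots + x_m F_m \;\succeq\; 0 ,

where F_0, ..., F_m are given symmetric matrices and \succeq 0 denotes positive semidefiniteness.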

LMITOOL is available in the form of a Matlab toolbox. The distribution contains the Matlab source code, a user's guide, and a set of solved control problems. The package is also available as a built-in library in Scilab.

[http://www.ensta.fr/uer/uma/gropco/lmi/lmitool.html]

L-moments
A library of routines for statistical analysis using L-moments and some auxiliary routines used by the L-moment routines. L-moments are measures of location, scale and shape of probability distributions, similar to the ordinary moments but estimable from linear combinations of order statistics. The following routines are provided for each of seven distributions:
  • CDFxxx, the cumulative distribution function;
  • QUAxxx, the quantile function (inverse cumulative);
  • LMRxxx, calculates the L-moment ratios of the distribution given its parameters; and
  • PELxxx, calculates the parameters of the distribution given its L-moments.
The available distributions (corresponding to the xxx in each of the above programs) are:
  • GEV, generalized extreme-value;
  • GLO, generalized logistic;
  • GNO, generalized normal (lognormal);
  • GPA, generalized Pareto;
  • GUM, Gumbel;
  • KAP, Kappa; and
  • WAK, Wakeby.
The Fortran source code is available. The documentation is contained within a technical report available in the same directory.
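
To illustrate what the routines estimate, here is a sketch of the standard unbiased sample L-moment estimators via probability-weighted moments; it is an illustration only, not code from this package.

  /* Sketch: unbiased sample L-moments l1..l4 from data sorted ascending
     (Hosking's probability-weighted-moment estimators; assumes n >= 4). */
  #include <stdio.h>

  static void sample_lmoments(const double *x, int n, double l[4]) {
      double b0 = 0.0, b1 = 0.0, b2 = 0.0, b3 = 0.0;
      for (int i = 0; i < n; i++) {            /* i is 0-based rank */
          double w1 = (double)i / (n - 1);
          double w2 = w1 * (i - 1) / (n - 2);
          double w3 = w2 * (i - 2) / (n - 3);
          b0 += x[i];
          b1 += w1 * x[i];
          b2 += w2 * x[i];
          b3 += w3 * x[i];
      }
      b0 /= n; b1 /= n; b2 /= n; b3 /= n;
      l[0] = b0;                               /* l1: location */
      l[1] = 2*b1 - b0;                        /* l2: scale */
      l[2] = 6*b2 - 6*b1 + b0;                 /* l3; t3 = l3/l2 is L-skewness */
      l[3] = 20*b3 - 30*b2 + 12*b1 - b0;       /* l4; t4 = l4/l2 is L-kurtosis */
  }

  int main(void) {
      double x[] = {1.0, 2.0, 3.0, 5.0, 8.0, 13.0};   /* already sorted */
      double l[4];
      sample_lmoments(x, 6, l);
      printf("l1=%g l2=%g t3=%g t4=%g\n", l[0], l[1], l[2]/l[1], l[3]/l[1]);
      return 0;
  }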

[http://www.stat.cmu.edu/general/]

Loadable Kernel Modules (LKM)
Loadable Kernel Modules are, as the name hints, modules that allow extra functionality to be dynamically included (i.e. loaded) in the Linux kernel. Packages that add functionality via an LKM include:
  • Coda, an advanced networked file system;

[http://www.redhat.com/mirrors/LDP/LDP/lkmpg/mpg.html]
[http://www.infowar.co.uk/thc/files/thc/LKM_HACKING.html]

loadlin
A boot loader for Linux that runs under DOS and can boot Linux either from a DOS prompt or from CONFIG.SYS.

[ftp://elserv.ffm.fgan.de/pub/linux/]

LOAF
See under distributions.

LOCFIT
A package for performing local regression and likelihood. The specific capabilities of LOCFIT include:
  • local regression;
  • multivariate local regression with either one-sided or robust smoothing and several dimension reduction methods;
  • local likelihood estimation using linear logistic, Poisson, gamma, or geometric regressions with cross validation;
  • density estimation with length biased sampling, probability contours, and Poisson process rate estimation;
  • survival and failure time analysis including hazard rate estimation, hazard regression, censored regression, and censored local likelihood;
  • discrimination and classification using logistic regression or density estimation for single or multiple classes;
  • goodness of fit including variance estimation, confidence intervals and bands, and nonhomogeneous variance;
  • bandwidth selection including predictive cross-validation, plug-in bandwidth selection, and validating the bandwidth selectors; and
  • adaptive fitting including transformation, local bandwidth choice, variable bandwidth fitting, and adaptive local likelihood.

The LOCFIT package is available as source code in either the S (see R) statistical language or ANSI C, the latter of which compiled with no problems on my Linux box using GCC. The documentation is available in both HTML and (partially) PostScript formats.

[http://cm.bell-labs.com/cm/ms/departments/sia/project/locfit/]

Loci
A network-distributed system of clients and servers designed for scientific data processing. The initial (7/99) focus of the project is to construct GUI wrappers for currently available command-line programs and databases that run on UNIX systems. The scripting language Python is being used as a base language for project development. The present and planned features of Loci include:
  • full use of the client/server paradigm;
  • communication across intra- and inter-nets;
  • bioinformatic collaboratories across intra- and inter-nets;
  • built-in support for sequence and structure analyses of macromolecules;
  • native support for phylogenetics and systematics;
  • a library of basic analysis tools;
  • production of 2-D vector-drawn schematics;
  • treatment of biological data as scientific illustrations;
  • provision of drawing tools and a materials library for figure construction;
  • tracking of the work path with a flow chart;
  • automatic activity logging to an electronic notebook; and
  • seamless integration of utilities for building applications and extending the system.
An initial release of this under the GPL is planned.

[http://theopenlab.org/loci/]
[http://www.bioinformatics.org/piper/]

Locomotive
An Open Source Web application server: a middleware server for developing and deploying application services, including servlets. It has been used to add such services as content management and publishing systems, online storefronts, Web-based email systems, database interfaces and consumer-oriented online services to Web sites. Requests are received at one or more Web servers, which handle requests for static pages and pass requests for dynamic pages to one or more instantiations of Locomotive. The Locomotives prepare the dynamic documents by extracting information from databases, performing calculations, and integrating the results into HTML templates (using the STEAM template language) to form new pages. These are then returned to the Web servers for display on the requesting browser.

The features supplied by Locomotive include:

  • dynamic and reusable content and content customization;
  • an administration interface providing statistics, security tracking and control;
  • scaling to handle thousands of simultaneous users via multithreading, cached database connections, thread pooling, and fast Servlet response;
  • automatic load balancing across multiple machines and instances of the software;
  • high availability via automatic restarting on spare servers to handle load increases or machine or network failures;
  • management of transactions across multiple distributed systems;
  • a Java Servlet API for building applications and integrating with external systems;
  • the STEAM HTML templating language with a variety of predefined environment variables; and
  • built-in ecommerce capabilities including integration with CyberSource via SCMP.

An Open Source version of Locomotive is available. It requires JDK 1.1.6 or greater and supports Linux servers, several databases including MySQL, and the Apache web server. Documentation includes an installation guide, a developer's manual, a STEAM manual, an administrator's manual and a reference manual describing the internals.

[http://www.locomotive.org/]

locus
A full text database and search engine designed to be:
  • personal but not lightweight, i.e. trading off slower indexing and higher disk usage for larger maximum database size and more focused searching;
  • smart without being hostile to the programmer, i.e. finding words, word combinations, phrases over more than one line with words near each other, etc.; and
  • a universal back-end that can handle any front-end and is supplied with a simple command-line version.
A source code distribution is available as is a Linux binary.

[http://www.locus.cz/locus/]
[http://antarctica.penguincomputing.com/locus/]

LODestar
A Level Of Detail generator that reads VRML 1.0 files and outputs simplified geometry. Level of detail generation is a crucial issue for achieving acceptable frame rates in VR and other 3-D applications. This program reduces the indexed face sets (polygonal data) and indexed line sets that comprise the bulk of VRML data using an algorithm based on octree quantization of vertices. In addition to creating LODs, the tool removes doubly defined coordinates, materials, normals, texture coordinates, faces and lines within the VRML nodes. Binary versions are available for several platforms including Linux Intel.

[http://www.cg.tuwien.ac.at/research/vr/lodestar/]

Loess
Locally-weighted regression is a procedure for estimating a regression surface via multivariate smoothing. It involves fitting a linear or quadratic function of the independent variables in a moving fashion analogous to how a moving average is computed for a time series. Loess substantially increases the domain of surfaces that can be estimated without distortion over classical methods such as fitting global parametric functions. It also has the property that analogues of statistical procedures used in parametric function fitting, e.g. ANOVA and t intervals, involve statistics whose distributions are well approximated by familiar distributions.
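
To make the "moving fashion" concrete, here is a minimal one-dimensional sketch of a locally weighted linear fit at a single point, using the tricube weight over a fixed half-width h. It only illustrates the idea; the Loess code itself handles several predictors, quadratic local models, nearest-neighbor bandwidths and robustness iterations.

  /* Sketch: locally weighted linear fit at x0 (1-D, tricube weights,
     fixed half-width h). Returns the fitted value at x0. */
  #include <math.h>

  double local_linear_fit(const double *x, const double *y, int n,
                          double x0, double h) {
      double sw = 0, swx = 0, swy = 0, swxx = 0, swxy = 0;
      for (int i = 0; i < n; i++) {
          double u = fabs(x[i] - x0) / h;
          if (u >= 1.0) continue;                  /* outside the local window */
          double w = pow(1.0 - u * u * u, 3.0);    /* tricube weight */
          double d = x[i] - x0;                    /* center predictor at x0 */
          sw   += w;
          swx  += w * d;
          swy  += w * y[i];
          swxx += w * d * d;
          swxy += w * d * y[i];
      }
      /* Weighted least squares for y = a + b*(x - x0); the fit at x0 is a. */
      double det = sw * swxx - swx * swx;
      if (det == 0.0) return (sw > 0.0) ? swy / sw : 0.0;  /* degenerate window */
      return (swy * swxx - swx * swxy) / det;
  }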

Source code distributions written in both Fortran and C are available for Loess. The use of the code is extensively documented in a 53 page user's guide available in PostScript format. See Cleveland and Grosse (1991).

[http://www.netlib.org/a/index.html]

Logcheck
A security program to process UNIX system logfiles. It automatically spots possible problems and security violations in your log files and sends the results via email. It has built-in support for several systems including Linux.

[http://www.psionic.com/abacus/logcheck/]

logic programming/languages
Logic programming languages are declarative languages whose basis is the relation as opposed to the function underlying functional languages - the other main category of declarative languages. Logic programming is derived from predicate logic. The most well-known logic programming language is PROLOG, of which several implementations are available.

Languages and/or systems that implement some form of logic programming include:

[http://www.comlab.ox.ac.uk/archive/logic-prog.html]
[http://www-i2.informatik.rwth-aachen.de/~hanus/FLP/index.html]
[http://www.logic-programming.org/]

Loglan'82
An object-oriented, universal, imperative programming language. Loglan has four features which distinguish it from other object-oriented languages:
  • multi-level inheritance which offers both nesting and inheritance of modules with the former enabling the sharing of environments and the latter private copies of environments;
  • multi-kind inheritance wherein a procedure can inherit from a class to enforce protocols, dynamically check axioms of abstract data types, etc.;
  • safety wherein Loglan signals a lot of programming errors that pass unrecognized in other systems as well as offers a safe deallocation statement and a safe storage management system; and
  • object-oriented concurrency wherein objects of processes can be created dynamically and allocated on a processor accessible via the network as well as communicate and synchronize via a novel mechanism called alien call.
Loglan also has all of the standard methods of object-oriented programming, e.g. classes and objects of classes, virtual methods, hierarchies of classes, exception handling, etc.

A source code distribution of Loglan is available for UNIX platforms. It is written in C and can be compiled using most available C compilers. A user's manual and various other documents are available in various formats.

[http://www.univ-pau.fr/~salwicki/loghome.html]

logtail
A Perl script for dynamically monitoring the entries in any number of log files on one or more machines on a network. Logs that transfer to new files are automatically followed and there is an option for translating numeric IP numbers into the corresponding hostnames. Items can be relayed to one or more other hosts on the network so machines can be monitored either locally or remotely.

[http://www.fourmilab.ch/webtools/logtail/]

LogWatch
A customizable and pluggable log monitoring system that can go through logs for a given period of time and create a detailed report about selected areas. A source code distribution is available as are RPM binaries.

[http://www.kaybee.org/~kirk/html/linux.html]

LoLA
The Library of Location Algorithms contains algorithms for the application area known as location planning. The LoLA package consists of a GUI, a text-based interface, and a programming API designed to enable users to write their own C++ programs using LoLA algorithms. The algorithms provided in LoLA include those for:
  • planar 1-location problems without restrictions;
  • planar 1-location problems with forbidden regions;
  • planar N-location problems;
  • network and graph problems; and
  • the discrete location problem.

[http://www.mathematik.uni-kl.de/~lola/]

Lollipop
A macro package that functions as a toolbox for writing TeX macros. It is an attempt to make structured text formatting available for environments where previously only WYSIWYG packages could be used because adapting the layout is so much easier with them than with traditional TeX macro packages (more about which can be found in the perennial TeX/WYSIWYG flame war taking place on the Net).

[http://tug2.cs.umb.edu/ctan/tex-archive/macros/lollipop/]

LONI
The Laboratory Of Neuro Imaging has a suite of programs to manipulate and display 2-D images (singly and in groups), 3-D image volumes, Region of Interest contours, and triangulated surface models.

The LONI program suite includes:

  • SEG, a segmentation editor that permits the delineation of Regions of Interest in images that can be edited and displayed;
  • FKU, which interpolates a surface model from a file containing 2-D contours drawn into parallel planes;
  • TTA, a multi-axial triangulator which interpolates a surface model from a file containing 2-D contours drawn in planes which have multiple axes;
  • MAUD, a multi-axial drawing tool used to draw non-coplanar contours;
  • Signdist, a program that creates a distance field from a LONI triangle model;
  • LTCL, an extended version of Tcl with the capacity to apply various spatial transformations to 2- and 3-D images, surface models, and contours;
  • TMA, a transformation assistant which provides a sophisticated user interface that simplifies the specification of frequently used volume transformations;
  • LRT, a resampling tool used to browse and resample LONI packed volume image files;
  • Warptool, a program that warps one 2-D image onto a reference image; and
  • AIR, a package of routines that performs automated registration of tomographic images.
There are also programs for converting between image file formats.

The source code for the LONI programs, written in C and C++, is available for all of the programs. The LONI suite also requires the use of Tcl/Tk, Motif, Perl, and the Netpbm library. Additionally, the Tcl/Tk extensions Itcl, Expect, and BLT are required. The documentation for each program is contained within a man page.

[http://www.loni.ucla.edu/]
[http://www.loni.ucla.edu/resources/loni_code/index.html]

Lore
A database management system for XML. The Lore project focuses on defining a declarative query language for XML, developing new technology for interactive searches over XML data, and building an efficient XML query processor. The features and functionality of Lore include:
  • a query language developed specifically for XML or other semistructured data that provides powerful path-traversal operators and makes extensive use of type coercion to help yield intuitive results for queries of XML data;
  • a cost-based query optimizer that analyzes the search space to determine effective query plans;
  • maintenance of DataGuides, i.e. a structural summary of all paths in a database;
  • dynamically integrating information fetched from one or more external data sources via an external data manager;
  • a technology for ranking database objects based on their proximity to other objects, where proximity is measured based on distances in the graph linking the objects together;
  • a view specification language incorporating techniques such as query rewriting and view materialization for defining data views; and
  • a system called Ozone that allows both structured and semistructured data to exist within the same database.
Binary distributions of Lore are available for Sun OS and Linux platforms.

[http://www-db.stanford.edu/lore/home/]

Lothar
A GUI-based tool for automating and simplifying the process of detecting and installing new hardware. It allows hardware to be detected or selected from a list and permits the adjustment of IO, IRQ and X86 settings. It uses, amongst other things, the detect hardware detection library.

[http://www.linux-mandrake.com/lothar/]

LOTOS
The Language Of Temporal Ordering Specification is a Formal Description Technique (FDT) developed as an international standard that provides the basis for the unambiguous definition of other standards. This formal description technique based on the temporal ordering of observational behavior is standardized as ISO/IEC 8807. LOTOS was originally based on the formal specification language CCS (Calculus of Communicating Systems), added some notation and concepts from CSP (Communicating Sequential Processes), and incorporated the abstract data type language ACT ONE to add the formal specification of data types. A newer version called Enhanced LOTOS or E-LOTOS is currently (1/99) in the process of being standardized.

LOTOS-related tools include:

[http://www.cs.stir.ac.uk/~kjt/research/well/]

LOTPS
A Fortran 77 program which serves as an interface for a set of subroutines which solve the scattered data interpolation problem. A smooth function passing through the given set of scattered points is constructed. The method used involves the construction of locally defined thin plate splines which are smoothly blended together through the use of a partition of unity defined on a rectangular grid on the plane. The functions in this partition are univariate piecewise Hermite cubic polynomials.

A source code distribution of LOTPS is available. It is written in Fortran 77 and documented via comment statements located in each source code file. This is part of CMLIB.

[http://sunsite.doc.ic.ac.uk/public/computing/igeneral/statlib/cmlib/]

Lout
A high-level language for document formatting which translates Lout source code into PostScript source code for previewing or outputting on a wide array of printers. The goal of the Lout project was to bring to document formatting languages the elegance of expression found in programming languages like Algol-60 and Pascal, which has resulted in a document formatting system that is flexible and easily extensible.

The Lout language consists of a small kernel of carefully chosen primitives with which many advanced features are constructed including:

  • rotation and scaling,
  • variable font selection,
  • paragraph and page breaking,
  • displays and lists,
  • floating figures and tables,
  • footnotes,
  • chapters and sections,
  • running page headers and footers,
  • odd-even page layouts,
  • automatically generated tables of contents, sorted indexes and reference lists,
  • bibliographic and other databases,
  • equations, tables and diagrams,
  • Pascal program formatting, and
  • automatically maintained cross-references.

A source code distribution of the Lout language is available. It is written in ANSI C and highly portable to any platform with an appropriate compiler. The documentation includes an extensive user's guide, an expert's guide for advanced users, and a couple of technical reports, all available in PostScript format.

[http://www.ptc.spbu.ru/~uwe/lout/lout.html]
[http://sunsite.unc.edu/pub/Linux/apps/wp/lout/]
[ftp://ftp.cc.gatech.edu/pub/linux/apps/wp/lout/]
[http://tug2.cs.umb.edu/ctan/tex-archive/support/lout/lout/index.html]

LPAC
A codec for the lossless compression of digital audio files. Typical compression ratios range from 1.5 to 4 for LPAC as opposed to 11 or so for the lossy MP3 format.

[http://www-ft.ee.tu-berlin.de/~liebchen/lpac.html]

LPARX
This provides efficient run-time support for dynamic, non-uniform scientific calculations running on MIMD distributed memory architecture systems (one of which is a network of workstations running under PVM). It is intended for particle methods and adaptive multilevel finite difference methods such as adaptive mesh refinement. It is implemented as a C++ class library.

[http://www.netlib.org/c++/index.html]

LPL
The Linear Programming Language is a mathematical modeling language which can be used to build, modify and document large mathematical models. It has been used to automatically generate MPS input files and solution reports of large linear and mixed integer programs. The compiler translates LPL models to the input code format of any LP/MIP solver, calls the solver, reads the solution back to an internal representation, and writes user-defined reports in the form of tables. It is available in the form of a MS-DOS/Windows executable that may run under WINE or DOSEMU.

[http://www2-iiuf.unifr.ch/tcs/lpl/]

Lpp
A library of Lisp-like functions and macros usable in C++ programs. Lpp is designed to provide as close as possible the semantics and style of Lisp rather than try to shape it to a C++ programming style. It should be useful for:
  • porting Lisp programs to and from C++;
  • implementing embedded AI subsystems in C++ environments;
  • providing an alternative for Lisp programmers who need to program in C++; and
  • any application where dynamically typed objects are needed in C++.
The features of Lpp include:
  • dynamic typing capability by having all Lpp objects be of type let;
  • first class function objects that represent C++ functions and can be manipulated like any other Lpp object;
  • augmentation of C++ control structure with the use of first class function objects and functions funcall and apply;
  • a predicate function that returns either nil or non-nil based on a test of its given arguments;
  • an identity function useful for debugging;
  • several functions that can be used in a debugger to print and inspect objects; and
  • a facility for performing manual garbage collection.
A source code distribution is available along with a user's manual in the usual common formats.

[http://www.interhack.net/projects/lpp/]

LPRng
An enhanced, extended, and portable implementation of the Berkeley LPR print spooler functionality. LPRng provides the same interface and meets the same RFC-1179 requirements while providing such additional features as:
  • lightweight lpr, lpc, and lprm programs (i.e. no databases needed);
  • dynamic redirection of print queues;
  • automatic job holding;
  • highly verbose diagnostics;
  • multiple printers serving a single queue;
  • client programs that do not need to run SUID root;
  • greatly enhanced security checks; and
  • an improved permission and authorization mechanism.

The LPRng package includes filters for PostScript, HP, and several dumb printers. The first two filters offer page counting and produce accounting information. A wide range of additional filters is available, including those that do page formatting and produce banner pages. The lpr and lpq commands can simulate the SVR4 lp and lpstat interface, eliminating the need for another spooler package. A PCNFSD server is distributed with the package along with the PC/DOS/Windows-based NFS-style printer spoolers. LPRng supports both Kerberos and PGP authentication methods.

A source code distribution of LPRng is available. It is written in C and can be compiled and used on most UNIX flavors via the autoconf scripts supplied in the distribution. The package is documented in several manuals and reports available in PostScript format as well as in a series of man pages.

[http://www.astart.com/LPRng.html]

lp_solve
A mixed integer linear programming solver based on the simplex method. This has been used to solve models with up to 30,000 variables and 50,000 constraints. It is written in C and available under the LGPL.

[ftp://ftp.es.ele.tue.nl/pub/lp_solve/]

lpmex
A Matlab interface to the lp_solve linear programming solver.

[ftp://ftp.mathworks.com/pub/contrib/v4/optim/lpmex/]

LPU
The Label Printing Utility is a tool for creating labels. It uses a line-oriented command language in which textual descriptions are used to create the desired objects. Language features include variables, expressions, assignments, procedures, control elements for repeated or conditional execution of statements, import of data from a textual database, and a small debugger. LPU creates output in PostScript format. A source code distribution is available.

[http://home.mainz-online.de/~jschrade/lpu.html]

LRD
A set of Matlab routines for the joint estimation of the parameters of Long-Range Dependence (LRD). LRD is commonly defined as the slow, power-law-like decrease at large lags of the autocovariance function of a stationary stochastic process. It has recently attracted strong interest in telecommunications with the discovery of self-similar and long-range dependent properties in data and communications traffic of various types. This is documented in a technical report.
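
In symbols (a standard formulation rather than one quoted from the accompanying report), a stationary process with autocovariance \gamma(k) is long-range dependent when

  \gamma(k) \;\sim\; c\,k^{-\alpha} \quad (k \to \infty), \qquad 0 < \alpha < 1,

so that \sum_k \gamma(k) diverges; equivalently, the Hurst parameter H = 1 - \alpha/2 lies in (1/2, 1).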

[http://www.emulab.ee.mu.oz.au/~darryl/estimation_code.html]

lrdns
The Log Reverse Domain Name System is a Perl script that converts IP addresses in access log files into textual domain names. This requires Perl 5.

[http://lambda.nic.fi/~ktmatu/lrdns/]

LRMP
The Light-weight Reliable Multicast Protocol provides a minimum set of functions for end-to-end reliable network transport suitable for bulk data transfer to multiple receivers. It is designed to work in heterogeneous network environments and support multiple data senders.

[http://webcanal.inria.fr/lrmp/index.html]

LRMPlib
An implementation of LRMP in Java. This is compatible with JDK 1.1 and 1.2.

[http://webcanal.inria.fr/lrmp/index.html]

LRP
See under distributions.

lrzsz
A communication package that provides an implementation of the XMODEM, YMODEM and ZMODEM protocols. The features of lrzsz include:
  • extensive portability via GNU autoconf;
  • crash recovery;
  • block sizes up to 8 Kb;
  • internationalization via GNU gettext; and
  • high throughput.
A source code distribution is available.

[http://194.245.36.15/uwe/lrzsz.html]

lsh
The little shell is a shell that's supposed to resemble the command interpreters of other operating systems. It doesn't have nearly as many features as the usual shells but has a reasonably comprehensive set of builtin commands. The features include:
  • case insensitive builtin commands;
  • virtual volume mappings wherein drive letters can be assigned to any part of the directory tree; and
  • hidden files are not included in its shell expansion if they cannot be accessed.
A source code distribution is available.

[http://www.cs.uct.ac.za/~mwelz/lsh.html]

lsodemat
See ODEPACK.

lsof
LiSt Open Files is a program that lists open files for running UNIX processes. Source code and binary distributions are available, with the latter available for most UNIX flavors.

[ftp://vic.cc.purdue.edu/pub/tools/unix/lsof/]

LSSA
The Least Squares Spectral Analysis program performs a least squares spectral analysis on a given time series.

[ftp://ftp.geod.nrcan.gc.ca/pub/GSD/craymer/software/lssa/]

LSTT
The Least Squares Transform Toolbox is a collection of Matlab routines for performing various least squares calculations. The routines include:
  • acf, autocovariance function of evenly spaced series;
  • acfbin, autocovariance function of unevenly spaced series using equally spaced lag bins;
  • acfunb, transforms biased ACF/ACvF to unbiased ACF/ACvF;
  • acfw, weighted autocovariance function of evenly spaced series;
  • covmat, forms covariance/correlation matrix from given ACF or ACvF;
  • covmate, forms covariance function for specified times given ACF or ACvF;
  • dfs, discrete Fourier spectrum (one-sided) without correlations between frequencies;
  • dft, discrete Fourier transform;
  • ffs, fast Fourier spectrum (one-sided) up to Nyquist frequency;
  • ffsall, fast Fourier spectrum (two-sided) for all Fourier frequencies;
  • fmax, maximum frequency estimable from a given series;
  • fmin, minimum frequency estimable from a given series;
  • fnyquist, Nyquist frequency for a given series using various methods;
  • freq, natural Fourier frequencies (up to Nyquist) for a given series;
  • freqall, all natural Fourier frequencies (incl. Nyquist) for a given series;
  • gendat, generates equally or unequally spaced test data;
  • hornedata, generates test data used by Horne and Baliunas (1986);
  • idft, inverse discrete Fourier transform;
  • ilsft, inverse least squares Fourier transform without correlations between frequencies;
  • ilsftc, inverse least squares Fourier transform with correlations between frequencies;
  • ilsftce, inverse least squares Fourier transform of an even function with correlations between frequencies;
  • ilsfte, inverse least squares Fourier transform of an even function without correlations between frequencies;
  • lags, finds all possible lags for a series;
  • lomb, normalized periodogram for an unevenly spaced series as defined by Lomb (1976);
  • lsft, least squares Fourier transform without correlations between frequencies;
  • lsfte, least squares Fourier transform of an even function without correlation between frequencies;
  • lss, least squares spectrum (one-sided) without correlations between frequencies;
  • lssa, least squares (one-sided) spectral analysis without correlations between frequencies;
  • lssaz, least squares (one-sided) spectral analysis with zero-padding;
  • lssc, least squares spectrum (one-sided) with correlations between frequencies;
  • lssconf, confidence interval for one-sided LS spectral value;
  • range, range of a series (max-min);
  • scargle, modified periodogram as defined by Scargle (1982);
  • trend, least squares trend estimation; and
  • zeropad, pads an n-length data series with n zeros.
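
For reference, the lomb and scargle routines listed above compute the normalized periodogram of Lomb (1976) and Scargle (1982); in its standard form (not copied from the toolbox), for a mean-removed series x_j sampled at times t_j with variance \sigma^2,

  P(\omega) \;=\; \frac{1}{2\sigma^2}\left[
      \frac{\bigl(\sum_j x_j\cos\omega(t_j-\tau)\bigr)^2}{\sum_j\cos^2\omega(t_j-\tau)}
    + \frac{\bigl(\sum_j x_j\sin\omega(t_j-\tau)\bigr)^2}{\sum_j\sin^2\omega(t_j-\tau)}
  \right],
  \qquad
  \tan(2\omega\tau) \;=\; \frac{\sum_j\sin 2\omega t_j}{\sum_j\cos 2\omega t_j}.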

[ftp://ftp.geod.nrcan.gc.ca/pub/GSD/craymer/software/matlab/]

LT NSL
The LT Normalized SGML Library consists of a set of C programs for manipulating SGML files and an API designed to ease the task of writing further such programs. The programs in the package include:
  • sggrep, a grep program aware of the tree structure of SGML files;
  • sgmltrans, translates NSGML files into other formats;
  • sgrpg, an SGML selection and translation tool;
  • sgcount, for counting the number of SGML elements in an NSGML file;
  • sgmltoken, performs text tokenization on SGML documents;
  • sgmlseg, segments an NSGML file that has been tokenized;
  • sgmlsb, a sentence boundary marking application;
  • textonly, removes all SGML markup from an NSGML file; and
  • nslwhere, produces a summary of the positions of certain elements in an SGML file.
A source code distribution is freely available for research purposes upon completion of an online registration form. A user's guide is available in the obvious formats.

[http://www.ltg.ed.ac.uk/software/nsl/]

Ltoh
A customizable LaTeX to HTML converter which handles text, tables, and hypertext links (but not equations). Ltoh is customizable in that the user can specify how to translate a given LaTeX2e macro into HTML, including the use of personal macros.

Ltoh is written in Perl and thus is portable to many platforms. A brief user's manual is currently (4/97) available. See also Hyperlatex, HyperTeX, LaTeX2HTML, tex2pdf, Tex2RTF, TeX4ht, and tth.

[http://www.best.com/~quong/ltoh/]

LTSP
An open source project for creating administration tools for setting up and maintaining diskless workstations.

[http://www.ltsp.org/]

L2CXFT
A Fortran 77 program for least-squares data fitting using non-negative second divided differences. This is useful for large data sets where convexity is assured, e.g. a large class of economic models. This method restores convexity in N measurements of a convex function contaminated by random errors. It minimizes the sum of the squares of the errors - subject to non-negativity of second divided differences - in two phases. First, an approximation close to the optimum is derived. Then, this approximation is used as the starting point of a dual-feasible quadratic programming algorithm that completes the calculation of the optimum. Constraints allow B-splines to be used, which reduces the problem to an equivalent one with fewer variables and where the knots of the splines are automatically determined from the data points. Iterative refinement is used to improve the accuracy of some calculations when round-off errors accumulate. This is TOMS algorithm 742 and is documented in Demetriou (1995) and Demetriou and Powell (1991).
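
The convexity condition is expressed through second divided differences; in standard notation (not taken from the TOMS paper), the fitted values y_i at abscissae x_1 < ... < x_N must satisfy

  f[x_i,x_{i+1},x_{i+2}]
  \;=\;
  \frac{\dfrac{y_{i+2}-y_{i+1}}{x_{i+2}-x_{i+1}} - \dfrac{y_{i+1}-y_i}{x_{i+1}-x_i}}
       {x_{i+2}-x_i}
  \;\ge\; 0,
  \qquad i = 1,\dots,N-2 .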

[http://www.mirror.ac.uk/sites/netlib.bell-labs.com/netlib/toms/]
[http://www.netlib.org/toms/index.html]

L2TP
The Layer 2 Tunneling Protocol is a method for encapsulating standard PPP through various media. It also allows for the encapsulation of PPP using UDP packets. It is most commonly used to establish virtual private networks (VPN) and to separate the devices that physically accept calls (e.g. a modem) from the device terminating the PPP (e.g. a central server). L2TP consists of:
  • LAC, the L2TP Access Concentrator which physically terminates a call; and
  • LNS, the L2TP Network Server which terminates and usually authenticates the PPP stream.
An alpha release of an implementation for Linux platforms was said (as of early 5/98) to be due within weeks.

[http://www.marko.net/l2tp/]

l2x
A general purpose converter from LaTeX to other formats. It consists of a parser written in C that calls a Tcl function for each LaTeX command, which returns a translated value. The translators currently (1/99) provided are:
  • l2html.tcl, for converting to a single HTML file; and
  • l2ms.tcl, for converting to nroff/troff -ms macros.
The latter translator has mostly been tested with Internet Drafts and RFCs.

[http://www.cs.columbia.edu/~hgs/l2x/]

Lua
An extension programming language designed to be used as a configuration language for any program that needs one. Lua supports general procedural programming features with data description facilities. It is implemented as a library which works embedded in a host client called the embedding program. This host program can invoke functions to execute a piece of code in Lua, read and write Lua variables, and register C functions to be called by Lua code. C functions can be used to augment Lua to cope with different domains and thus create customized programming languages which share a syntactical framework. A Lua interpreter created with the library is included in the distribution and can be used in stand-alone mode.
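
As a rough illustration of this embedding model, here is a sketch of a minimal host program. It is written against the current Lua 5.x C API rather than the API of the release described here, so names such as luaL_newstate are assumptions about the reader's Lua version.

  /* Minimal sketch of embedding Lua in a C host (Lua 5.x C API).
     Build roughly as: cc host.c -llua -lm */
  #include <stdio.h>
  #include <lua.h>
  #include <lauxlib.h>
  #include <lualib.h>

  /* A C function registered with Lua: returns twice its numeric argument. */
  static int twice(lua_State *L) {
      double x = luaL_checknumber(L, 1);
      lua_pushnumber(L, 2.0 * x);
      return 1;                              /* number of results returned */
  }

  int main(void) {
      lua_State *L = luaL_newstate();        /* create an interpreter state */
      luaL_openlibs(L);                      /* load the standard libraries */
      lua_register(L, "twice", twice);       /* make twice() callable from Lua */

      /* Execute a piece of Lua code that calls back into C. */
      if (luaL_dostring(L, "answer = twice(21)") != 0) {
          fprintf(stderr, "lua error: %s\n", lua_tostring(L, -1));
          lua_close(L);
          return 1;
      }

      /* Read a Lua variable back from C. */
      lua_getglobal(L, "answer");
      printf("answer = %g\n", lua_tonumber(L, -1));

      lua_close(L);
      return 0;
  }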

Significant features of Lua include:

  • a simple, Pascal-like syntax;
  • powerful data description constructs like associative arrays;
  • user-controlled type constructors;
  • fallbacks for extending the meaning of the language in unconventional ways; and
  • the compilation of Lua programs into bytecodes which are then interpreted to simulate a virtual machine.

The Lua distribution includes the source code which is written in ANSI C. It can be compiled and used on generic UNIX platforms with an appropriate compiler. A user's and reference manual is available in PostScript format. Several useful tools and libraries available for Lua are described below.

[http://www.tecgraf.puc-rio.br/lua/]
[ftp://csg.uwaterloo.ca/pub/lhf/lua/]
[ftp://ftp.ntua.gr/pub/lang/lua/]
[ftp://ftp.uni-trier.de/pub/languages/lua/]

ldb
The Lua DeBugger is a debugging system for Lua which is provided as a C library of Lua functions to be linked with the application. It features breakpoints, step-by-step execution, automatic visualization of variables, line actions, and dynamic code execution. The source code is available as is a user's manual in PostScript format.

[http://www.inf.puc-rio.br/~roberto/ldb.html]

toLua
A tool to simplify the integration of C/C++ code with Lua. ToLua automatically generates a binding to access the C/C++ code from Lua based on a cleaned header file. It automatically maps C/C++ constants, external variables, functions, classes and methods to Lua using the Lua API and fallback facilities. The source code is available.

[http://www.tecgraf.puc-rio.br/~celes/tolua/]

TkLua
A package which allows Lua to access Tk widgets. The source code and a manual are available.

[http://www.tecgraf.puc-rio.br/~celes/tklua/]

CGILua
A package which allows CGI scripts to be created using Lua.

[http://www.tecgraf.puc-rio.br/cgilua/]

LUCGI
A C++ library for the cross-platform development of CGI programs. This is available under the LGPL.

[http://www.iit.edu/~duerchr/]

LUCY
A Maple program that uses the general theory of Clifford algebras to perform calculations involving real or complex spinor algebra and spinor calculus on manifolds in any dimension. This allows the exploration of the structure of spinor covariant derivatives on flat or curved spaces and the correlation of the various spinor-inner products with the basic involutions of the underlying Clifford algebra. This is documented in a user's manual in TeX format.

[http://www.birkhauser.com/book/ISBN/0-8176-3907-1/wang/wang.html]

LUDE
The Logithèque Universitaire Distribuée et Extensible (roughly, the Distributed and Extensible University Software Library) is a distributed software library that enables a large number of sites to pool the software packages compiled by their system administrators. Each computer can act as a client and/or server. A client needs only a network connection to a server, and a server installs packages for export via either NFS or FTP. The features and functionality of LUDE include:
  • serving heterogeneous systems;
  • allowing each disk server to decide - on a package-by-package basis - if it wants access to, or local copies of, the executables and/or source code;
  • allowing users to access new software packages without having to edit their configuration files;
  • allowing more than one version of a package to coexist during transitions;
  • keeping each package in a separate subtree to ease the management of disk space and prevent name conflicts; and
  • all documentation available via a single user interface.
A source code distribution of LUDE is available and can be used on systems that offer tree-like file systems and symbolic links, e.g. Linux. This requires a Perl 5 installation and a few common system commands.

[http://www.iro.umontreal.ca/lude2/]

LUG
The Librería de Utilidades Gráficas (or the Graphic Utilities Library) is a library containing functions for working with several popular graphics file formats, viewing graphics files, and performing various digital image processing tasks. The file formats it can handle include: GIF, JPEG, PBM/PGM/PPM, PCX, PIX, PostScript, RGB, RLA, RLE, SGI, TGA, and TIFF. There are viewers for X11 (using Xlib) and for the Linux text mode. The image processing tasks it can perform include: blurring, changing colors with distances, chroma, cutting, dithering, flipping vertically and horizontally, gamma correction, histogram equalization, conversion back and forth between HSL and RGB, masking, changing colors, changing to b&w, fading, median filtering, pasting, quantizing, rescaling, rotating, sharpening, and more.

A source code distribution of LUG is available as is a binary version for Linux Intel. It is written in C and can be compiled readily on most UNIX platforms. It is documented in a user's manual available in PostScript format.

[http://www3.uniovi.es/Nosotros/rivero/LUG/]

lurkftp
A tool for monitoring changes in FTP sites.

[http://mama.indstate.edu/users/dark/]

Lustre
A project to build a next-generation cluster file system. The principal design considerations are:
  • devices that can manipulate file objects, i.e. storage will be provided by systems acting as storage controllers that perform I/O to block devices and exploit the object protocol on the SAN;
  • a stackable object driver model allowing the development of direct drivers, logical and client object drivers, and associated target drivers all within one hierarchical framework; and
  • a choice of object-based filesystems including one meant to be used on non-shared storage devices, an inode filesystem that provides direct access to objects named by object ID, and a cluster file system.
Snapshots are currently (3/01) available in the form of experimental kernel code. User beware.

[http://www.lustre.org/]

LVM
A Logical Volume Manager is a subsystem for online disk storage management across UNIX implementations. The LVM adds another layer between the actual physical peripherals (e.g. hard disks) and the I/O interface in the kernel to obtain a logical view of the disks. It allows physical volumes (PVs), i.e. disks, to be treated as a pool of data storage consisting of equal-size extents. An LVM system consists of arbitrary groups of PVs organized into volume groups (VGs), each consisting of one or more PVs. The volume group is the logical analog of the hard disk as the basic unit of storage in an LVM system; it can be seen as a virtual disk consisting of one or more physical disks. The disk space of a VG can be divided into virtual partitions called logical volumes (LVs), with an LV capable of spanning one or more physical volumes (its size determined by its number of extents). The LVs can then be used like regular disk partitions to create a file system or swap space.

[http://linux.msede.com/lvm/]

Linux LVM
The Linux Logical Volume Manager implements the basic functionality of an LVM for Linux systems, i.e. read/write operations on physical volumes, creating volume groups from one or more physical volumes, and creating one or more logical volumes in volume groups. The functionality of this package is implemented as a set of utilities that can be divided into several categories. The basic LVM utilities are:
  • lvmdiskscan, scans available peripherals to find which are usable for an LVM;
  • lvmchange, changes the attributes of an LVM;
  • lvmsadc, a system activity data collector that gathers the read/write statistics of the LVM; and
  • lvmsar, a system activity reporter that presents the statistics gathered by lvmsadc.
A file system integration utility is:
  • e2fsadm, for resizing a logical volume containing an unmounted ext2 filesystem and then extending the filesystem using resize2fs.

LVM tools for working with physical volumes (PVs) are:

  • pvcreate, for creating a PV on one or more disk partitions;
  • pvdisplay, displays the attributes of one or more physical volumes;
  • pvchange, for changing the allocation permission of one or more physical volumes;
  • pvmove, for moving the logical/physical extents allocated on one logical/physical volume to one or more other physical volumes;
  • pvscan, scans all disks for defined physical volumes; and
  • pvdata, displays the volume group descriptor array on a physical volume for debugging.

LVM tools for working with volume groups (VG) are:

  • vgcreate, creates a volume group with at least one PV;
  • vgdisplay, views the attributes of a VG along with its physical and logical volumes and their sizes;
  • vgchange, changes the attributes of a VG;
  • vgextend, adds PVs to a VG;
  • vgreduce, removes one or more unused PVs from a VG;
  • vgremove, removes one or more volume groups;
  • vgsplit, splits a PV from an existing VG into a new VG;
  • vgmerge, merges two existing VGs;
  • vgrename, renames an existing and inactive VG;
  • vgscan, scans all disks for VGs and builds a database for all the other commands to use;
  • vgcfgbackup, backs up the metadata of VGs;
  • vgcfgrestore, restores the VG descriptor area from a backup file;
  • vgexport, creates one or more inactive VGs for deinstalling physical volumes;
  • vgimport, makes VGs known to a system;
  • vgmknodes, creates a VG directory and special files for an existing volume group, for restoring deleted special files; and
  • vgck, checks VG group descriptor area consistency.

LVM commands for working with logical volumes (LVs) are:

  • lvcreate, creates a new LV in a VG by allocating logical extents from the free physical extents of that VG;
  • lvdisplay, shows the attributes of an LV;
  • lvchange, changes attributes of an LV;
  • lvextend, extends the size of an LV;
  • lvreduce, reduces the size of an LV;
  • lvremove, removes one or more inactive LVs;
  • lvrename, renames an existing and inactive LV; and
  • lvscan, scans all known VGs for defined LVs.

A source code distribution of this alpha (i.e. buyer beware) code is available as a kernel patch. All the commands are documented in man pages.

[http://linux.msede.com/lvm/]

LVQ_PAK
The Learning Vector Quantization package implements a group of algorithms applicable to statistical pattern recognition. In LVQ the classes are described by a relatively small number of codebook vectors properly placed within each zone such that the decision borders are approximated by the nearest-neighbor rule. A source code distribution of this C package is freely available for scientific or research purposes. It is documented in a technical report that serves as a user's manual.
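
For reference, the basic update rule of Kohonen's LVQ1 (a standard statement, not copied from the package's documentation) moves only the codebook vector m_c nearest to each training sample x:

  m_c(t+1) = m_c(t) + \alpha(t)\,[x(t) - m_c(t)]   \quad (x classified correctly),
  m_c(t+1) = m_c(t) - \alpha(t)\,[x(t) - m_c(t)]   \quad (x classified incorrectly),

with all other codebook vectors left unchanged and \alpha(t) a decreasing learning rate.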

[http://www.cis.hut.fi/nnrc/nnrc-programs.html]

LWGate
The Mailing List WWW Gateway is a CGI script written in Perl that merges the Web and mailing lists by:
  • producing information about mailing lists for Web users;
  • assisting users in executing mailing list commands via a forms interface; and
  • automatically generating a hypertext interface for listing mail archives available on the local server.

[http://www.netspace.org/~dwb/lwgate/]

lxb
A Linux X11/Motif GUI builder. You can build a GUI made of Motif widgets instantiated by clicking on icons, move and resize them with the mouse, edit their resources, and move about in the widget hierarchy via the arrow keys. Once the GUI is built, the C source files, an X resource file and a makefile are generated. This software is still under development and as such is not yet fully functional.

[http://www.umn.edu/nlhome/g257/parki005/lxb/lxb.html]
[http://sunsite.unc.edu/pub/Linux/X11/devel/]

LXR
A general hypertext cross-referencing tool first developed for use with the Linux kernel. This project aims to create a versatile cross-referencing tool for relatively large code repositories. LXR is based on Apache on the server side, although any server with CGI capabilities should suffice. The regular expression facilities of Perl are used to create the index. This index includes all identifiers, i.e. all macros, typedefs, structs, enums, unions, functions, function prototypes, and variables. The features and functionality include:
  • indexing of and quickly jumping to the declaration of any global identifier;
  • indexing of references to global identifiers;
  • quick access to function declarations, data (type) definitions and preprocessor macros;
  • quick overviews of which code areas will be affected by changing a function or type definition; and
  • email and include file links.
A source code distribution is available. Examples of the use of LXR include:

[http://lxr.linux.no/]

lxrun
A user-space program that allows users of SCO OpenServer 5.0.x, SCO UnixWare 2.1.x and 7.x, and Sun Solaris x86 2.6 and 7 operating systems to run ELF and a.out format Linux binaries. This works by remapping system calls on the fly which - given the small differences between Linux and iBCS2 binaries - usually doesn't result in much of a performance penalty. Source code distributions are available under the MPL.

[http://www.ugcs.caltech.edu/~steven/lxrun/]
[http://www.sun.com/linux/lxrun/]
[http://www.sco.com/skunkware/lxrun/]

lyngby
A Matlab toolbox for the analysis of functional magnetic resonance imaging (fMRI) time series. The purpose is to model 4-D fMRI data (i.e. 3-D volumes over time) and to derive parameter sets from them that will allow easy interpretation and identification. All the modeling methods available have low-level functions and a GUI interface for easy access to the data and modeling results.

The models available in lyngby include:

  • cross-correlation;
  • FIR filtering;
  • exhaustive FIR filtering;
  • K-means clustering;
  • grid search and iterative Lange-Zeger models;
  • the Ardekani t-test, F-test and F-test with nuisance subspace;
  • the ordinary t-test;
  • the Kolmogorov Smirnov test;
  • neural network regression and saliency;
  • Poisson filter;
  • singular value decomposition (SVD);
  • Strother CVA; and
  • Strother Orthonormalized PLS (partial least square) method.

[http://hendrix.imm.dtu.dk/software/lyngby/]

Lynx
A fully featured line-mode Web browser for those connected to the Web via cursor-addressable, character-cell terminals or emulators, e.g. VT 100 terminals and emulators. The W3C Line Mode Browser is another line-mode browser available for Linux platforms.

[http://lynx.browser.org/]
[http://www.crl.com/~subir/lynx.html]
[http://www.slcc.edu/lynx/index.html]
[http://sunsite.unc.edu/pub/Linux/apps/www/browsers/]

Lyx
A word processor that uses LaTeX as a typesetting engine. The user need not know anything about TeX to use Lyx. It is called almost-WYSIWYG since the document displayed onscreen is not exactly what will be created for printing by LaTeX, although the screen version gives a good approximation of the LaTeX formatting. There is even an almost-WYSIWYG math editor that accepts standard TeX code for symbols and displays them as they will be typeset. Lyx combines the convenience of a WYSIWYG word processor with the tremendous power and flexibility of the LaTeX document formatting system.

The features include:

  • several different textclasses for creating letters, articles, books, etc.;
  • numbered headlines, tables of contents with hypertext functionality, and nested lists;
  • an interactive WYSIWYG math editor;
  • almost-WYSIWYG display and editing of documents which can be saved either as LaTeX source files or in an internal Lyx format;
  • the easy inclusion of pictures, LaTeX code or preambles, and math formulas;
  • automatic formatting for outlines, tables, letters, footnotes, lists, formulas, bibliographies, and more;
  • spell checking using ispell;
  • a pop-up window for navigating between sections;
  • automatic handling of references and support for BibTeX;
  • several internationalization features including the display of character sets onscreen and menus in several languages;
  • automatic compilation of documents using LaTeX;
and more.

A source code version of Lyx is available as are binaries for Linux Intel and IBM AIX platforms. A user's manual is under development as is a FAQ, both of which are available in the obvious formats.

[http://www.lyx.org/]

LZO
The Lempel-Ziv-Oberhumer package is a portable lossless data compression library written in ANSI C which offers fast compression and very fast decompression, with the latter requiring no memory, i.e. it is intended for use in real-time applications. There are also slower compression levels for achieving greater compression ratios which still decompress at a very high speed. The current (8/97) release includes 8 compression algorithms for wide backwards compatibility with other software. A source code distribution of LZO is available. It is written in ANSI C and has been compiled on a wide range of platforms. See the related lzop and miniLZO packages.
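
A rough sketch of how the library is typically driven, assuming the lzo1x interface and the documented worst-case output margin of current LZO releases (treat the names below as assumptions about your installed version rather than quotations from its manual):

  /* Sketch: compress and decompress a buffer with LZO's lzo1x algorithm. */
  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>
  #include <lzo/lzo1x.h>

  int main(void) {
      unsigned char in[4096], out[4096 + 4096 / 16 + 64 + 3], back[4096];
      lzo_uint in_len = sizeof(in), out_len, back_len = sizeof(back);
      void *wrkmem = malloc(LZO1X_1_MEM_COMPRESS);   /* scratch memory for the compressor */

      memset(in, 'x', in_len);                       /* highly compressible test data */

      if (lzo_init() != LZO_E_OK || wrkmem == NULL)
          return 1;

      /* Fast compression level; out must allow for the worst-case expansion. */
      if (lzo1x_1_compress(in, in_len, out, &out_len, wrkmem) != LZO_E_OK)
          return 1;
      printf("%lu -> %lu bytes\n", (unsigned long)in_len, (unsigned long)out_len);

      /* Decompression needs no work memory at all. */
      if (lzo1x_decompress(out, out_len, back, &back_len, NULL) != LZO_E_OK
              || back_len != in_len || memcmp(in, back, in_len) != 0)
          return 1;

      free(wrkmem);
      return 0;
  }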

[http://wildsau.idv.uni-linz.ac.at/mfx/lzo.html]

lzop
A compression utility based on the LZO library which is designed to be a companion to gzip. The advantages of lzop are much higher compression and decompression speeds at the cost of the compression ratio. It was designed to be reliable, portable, and have a reasonable drop-in compatibility with gzip, with the functionality and behavior modeled very closely after gzip. Files compressed with lzop have the suffix .lzo. A source code distribution of lzop is available as is a binary for Linux platforms. It is written in ANSI C and portable to many platforms.

[http://wildsau.idv.uni-linz.ac.at/mfx/lzop.html]



Manbreaker Crag 2001-03-08