The dynamic loader is the system by which modules and their dependencies can be loaded, unloaded, or reloaded at runtime, and through which we access the functions we need.
At its lowest level, the hs-plugins loader is a binding to the GHC runtime loader and linker. This layer is a direct reimplementation of Andre Pang’s runtime_loader (barely any code has changed). The code at this level can only load single modules, or packages/archives (which are just objects too); any dependency resolution must be performed by hand.
On top of Andre’s interface is a more convenient layer through which users should interact with the dynamic loader. The most significant extension to Andre’s work is the automatic calculation and loading of a plugin’s package and module dependencies via .hi file information. It also handles initialisation of the loader, and retrieval of values from the plugin, in a more convenient way. Some state is also stored in the loader to keep track of which modules and packages have been loaded, to prevent unnecessary (actually, fatal) reloading of object files and packages that are already loaded. Thus you can safely load several plugins that share common dependencies at once, without worrying about those dependencies being loaded multiple times. We also store package.conf information in the state, so we can work out where a package lives and what it depends on.
The ability to remember which packages and objects have been loaded is based on ideas in Hampus Ram’s dynamic loader, which has a more advanced dependency tracking system, with the ability to unload the dependencies of a plugin. hs-plugins doesn’t provide “cascading unloading”. The advantage hs-plugins has over Hampus’ loader seems to be the automatic dependency resolution via vanilla .hi files and the dynamic recompilation support.
Using load, any library packages or .o files that a plugin depends upon will be automatically loaded prior to loading the module itself. load then looks up a symbol from the object file, and returns the value associated with that symbol as a conventional Haskell value. It should also be possible to load a GHCi-style .o archive of object files this way, although there is currently no way to extract multiple plugin interfaces from an archive of objects.
The application writer is not required to recalculate dependencies if the plugin changes, and the plugin author does not need to specify what dependencies exist, as is required in the lower level interface. This is achieved by using the dependency information calculated by GHC itself, stored in .hi files, to work out which modules and packages to load, and in what order. A plugin in hs-plugins is really a pair of an object file (or archive) and a .hi file, containing package and module dependency information.
The .hi file is created by GHC when the plugin is compiled, either by hand or via make. load uses a binary parser to extract the relevant information from the .hi data. Because the dependency information is stored in a separate file from the application that loads the plugin, it can be recalculated without having to modify the application. This made it easy to extend load to support recompilation of module source, even if dependencies change, because dependencies are no longer hard-coded into the application source itself, but are specified by the plugin.
Assuming we have a plugin exporting some data, “resource”, with a record field field :: String, here is an example call to load:
   do m_v <- load "Test.o" ["."] [] "resource"
      v <- case m_v of
             LoadSuccess _ v -> return v
             _               -> error "load failed"
      putStrLn $ field v
This loads the object file Test.o, and any packages or objects Test.o depends on. It resolves undefined symbols, and returns the Haskell value named “resource” from the object file, bound here as “v”. This must be a value exported by the plugin. We then retrieve the field component of v, and print it out.
This simple usage assumes that the plugin to load is in the same directory as the application, and that the API defining the interface between plugin and application is also in the current directory (hence the “.” in the second argument to load).
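For reference, the shape of the load interface is roughly the following (a sketch of the types in System.Plugins, not an authoritative signature):

```haskell
load :: FilePath        -- path to the object file, e.g. "Test.o"
     -> [FilePath]      -- include paths to search for .hi files
     -> [PackageConf]   -- extra package.conf files, usually []
     -> Symbol          -- name of the value to retrieve, e.g. "resource"
     -> IO (LoadStatus a)

data LoadStatus a
    = LoadSuccess Module a    -- the loaded module, and the value found
    | LoadFailure Errors      -- error messages
```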
It is also possible to load the plugins or eval libraries in GHCi. Here, for example, we load the plugs interactive environment in GHCi, and evaluate some code. The source to plugs is in Appendix sec:plugs.
paprika$ ghci -package-conf ../../../plugins.conf.inplace -package eval
   ___         ___ _
  / _ \ /\  /\/ __(_)
 / /_\// /_/ / /  | |      GHC Interactive, version 6.3, for Haskell 98.
/ /_\\/ __  / /___| |      http://www.haskell.org/ghc/
\____/\/ /_/\____/|_|      Type :? for help.

Loading package base ... linking ... done.
Loading package altdata ... linking ... done.
Loading package unix ... linking ... done.
Loading package mtl ... linking ... done.
Loading package lang ... linking ... done.
Loading package posix ... linking ... done.
Loading package haskell98 ... linking ... done.
Loading package haskell-src ... linking ... done.
Loading package plugins ... linking ... done.
Loading package eval ... linking ... done.
Prelude> :l Main
Skipping  Main ( Main.hs, Main.o )
Ok, modules loaded: Main.
Prelude Main> main
Loading package readline ... linking ... done.
       __
  ____ / /_  ______ ______
 / __ \/ / / / / __ `/ ___/    PLugin User's GHCi System, for Haskell 98
/ /_/ / / /_/ / /_/ (__  )     http://www.cse.unsw.edu.au/~dons/hs-plugins
/ .___/_/\__,_/\__, /____/     Type :? for help
/_/           /____/

Loading package base ... linking ... done
plugs> map (\x-> x + 1) [0..10]
[1,2,3,4,5,6,7,8,9,10,11]
plugs> :t "haskell"
"haskell" :: [Char]
plugs> :q
*** Exception: exit: ExitSuccess
Prelude Main> :q
Leaving GHCi.
Support is also provided to unwrap and check the type of dynamically typed plugin values (those wrapped in a toDyn), via dynload. This is the same as load, except that instead of returning the value it finds, it unwraps a dynamically typed value, checks its type, and returns the unwrapped value. This provides further trust that the symbol you are retrieving from the plugin is of the type you think it is, beyond the trust you have from knowing that the plugin was compiled against a shared API. With dynload it is not enough for an object file to export a symbol of the right name: the value must also carry the Data.Dynamic representation of its type. pdynload rectifies most of dynload’s limitations, but at the cost of additional running time.
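As a sketch of how this fits together (the module and value names here are illustrative, not taken from the source): the plugin wraps its interface value with toDyn, and the application retrieves it with dynload, whose type is that of load with a Typeable constraint on the result.

```haskell
-- Hypothetical plugin module: the exported value is a Dynamic,
-- carrying a runtime representation of its type.
module Plugin ( resource ) where

import Data.Dynamic ( Dynamic, toDyn )
import API               -- the shared interface module (assumed)

resource :: Dynamic
resource = toDyn plugin  -- plugin :: Interface, defined against the API

-- In the application, dynload checks the wrapped type before unwrapping:
--
--   m_v <- dynload "Plugin.o" ["api"] [] "resource"
--   case m_v of
--     LoadSuccess _ (v :: Interface) -> putStrLn (field v)
--     LoadFailure es                 -> mapM_ putStrLn es
```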
Alongside the dynamic loader is the compilation manager. This is a make-like system for compiling Haskell source prior to loading it. make checks whether a source file is newer than its associated object file. If so, the source is recompiled to an object file, and a new dependency file is created, in case the dependencies in the source have changed. The module can then be loaded. The idea is to allow EDSL authors to write plugins without having to touch a compiler: it is all transparent. It also allows us to enforce type safety in the plugin by injecting type constraints into the plugin source, as discussed earlier.
The effect is much like hi (Hmake Interactive), funnily enough. An application using both make and load behaves like a Haskell interpreter, as eval does. You modify your plugin; the application notices the change, recompiles it (possibly issuing type errors), and then reloads the object file, providing the application with the latest version of the code.
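A minimal sketch of such an application, assuming the plugins package is installed and an API module exporting field :: Interface -> String as in the earlier load example (module layout and paths are illustrative):

```haskell
module Main where

import System.Plugins   -- make, load, unload, from the plugins package
import API              -- field :: Interface -> String (assumed)

-- Recompile the plugin if its source changed, reload it, and use it.
-- Calling this in a loop gives the interpreter-like behaviour described.
reload :: IO ()
reload = do
    status <- make "Plugin.hs" ["-iapi"]
    case status of
      MakeFailure es  -> mapM_ putStrLn es
      MakeSuccess _ o -> do
        m_v <- load o ["api"] [] "resource"
        case m_v of
          LoadFailure es  -> mapM_ putStrLn es
          LoadSuccess m v -> do
            putStrLn (field v)   -- use the freshly loaded code
            unload m             -- so the next load gets the new version
```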
An example:
   do status <- make "Plugin.hs" []
      obj <- case status of
               MakeSuccess _ o -> return o
               MakeFailure e   -> mapM_ putStrLn e >> error "failed"
      m_v <- load obj ["api"] [] "resource"
      v <- case m_v of
             LoadSuccess _ v -> return v
             _               -> error "load failed"
      putStrLn $ field v
make accepts a source file as an argument, and a (usually empty) list of GHC flags needed to compile the object file. It then checks to see whether compilation is required and, if so, calls GHC to compile the code with the arguments supplied. Any errors generated by GHC are returned in the failure result.
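The corresponding interface has roughly the following shape (a sketch of the System.Plugins types, not an authoritative signature):

```haskell
make :: FilePath        -- the plugin source, e.g. "Plugin.hs"
     -> [Arg]           -- extra GHC flags, usually []
     -> IO MakeStatus

data MakeStatus
    = MakeSuccess MakeCode FilePath   -- path of the object file produced
    | MakeFailure Errors              -- GHC's error messages
```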
Usually it will be necessary to ensure that GHC can find the plugin API to compile against. This can be done either by making sure the API is in the same directory as the plugin, or by adding a -i flag to make’s arguments. If the API is provided as a package with a package.conf file, make can be given -package-conf arguments to the same effect.
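For instance (the paths and package name here are assumptions for illustration), either of the following should let GHC find an API kept in ./api:

```haskell
status  <- make "Plugin.hs" ["-iapi"]
status' <- make "Plugin.hs" ["-package-conf", "api.conf", "-package", "api"]
```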
Normally, make generates the .o and .hi files in the same directory as the source file. This is not always desirable, particularly for interpreter-like applications. To solve this, you can pass ["-odir", path] as elements of the argument list to make, and it will respect these arguments, generating the object and interface files in the directory specified. GHC’s "-o" argument is respected in a similar manner, so you could also say ["-o", obj] to the same effect.
make is entirely optional. To use the loader on its own, all users have to do is make sure they only load object files that also have a .hi file. This will usually be the case if the plugin was compiled with GHC.
makeWith merges two source files, taking the function and value declarations from one file and any syntax in the second, creating a new, third source file. It then compiles this source file via make.
This function exists as a benefit to EDSL authors, and is related to the original motivation for hs-plugins as a .conf file language library. Configuration files need to be clean and simple, and you can’t rely on, or trust, the user to get all the compulsory details correct. The solution is to factor out any compulsory syntax, such as the module name and imports, to provide a default instance of the API, and to store this code in a separate file provided by the application writer, not the user. makeWith then merges whatever the user has written with the syntax stub, generating a complete Haskell plugin source with the correct module name and import declarations. While we are at it, we also ensure the plugin exports only a single interface value.
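A sketch of what such a pair of files might look like (all names here are illustrative assumptions):

```haskell
-- Plugin.stub: the compulsory syntax, written by the application author.
module Plugin ( resource ) where

import API

resource :: Interface
resource = defaultInterface

-- Plugin.in: all the user writes. makeWith merges these declarations
-- into the stub, overriding the default:
--
--   resource = defaultInterface { field = "hello" }
```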
makeWith thus requires a Haskell parser to parse the two source files and merge the results: we are merging abstract syntax. This is implemented using the Language.Haskell parser library. Unfortunately, this library doesn’t implement all of GHC’s extensions, so if you wish to use makeWith you can only write Haskell source that this library can parse, which is just H98 and a few extensions. This is another shortcoming of the current design that will be overcome with -package ghc. Remember, however, that for normal uses of make and load you are unrestricted in the Haskell you use. This is the same kind of restriction that happy, the Haskell parser generator, places on the code you can provide in its .y source.
makeWith also makes use of line pragmas. If the merged file fails to compile, judicious use of line number pragmas ensures that the user receives error messages reported with reference to their own source file, not to line numbers in the merged file. This is a property of the Language.Haskell parser that we can make use of.
An example of makeWith:
   do status <- makeWith "Plugin.in" "Plugin.stub" []
      obj <- case status of
               MakeFailure e   -> mapM_ putStrLn e >> error "failed"
               MakeSuccess _ o -> return o
      m_v <- load obj [apipath] [] "resource"
      v <- case m_v of
             LoadSuccess _ v -> return v
             _               -> error "load failed"
      putStrLn $ field v
We combine the user’s file (Plugin.in) with a stub of syntax generating a new, third Haskell file in the default tmpdir. This is compiled as per usual, producing object and interface files. The object is then loaded, and we extract the value exported.
Using makeWith it is possible to write very simple, clear Haskell plugins that appear not to be Haskell at all. It is an easy way to have EDSL users write plugins that are actually Haskell programs, e.g. for configuration files. See the examples that come with the source.