Monday, August 23, 2010

WekaSharp: An F# wrapper for Weka

There are 3 posts in this series:

1. An F# wrapper for Weka (this post). The minimal wrapper in F# for Weka.

2. More features. That post will contain improvements over the minimal wrapper, e.g. more dataset processing functions, some plot functionality, etc.

3. Tutorial/usage examples. That post is for end users who might have no interest in reading the implementation details, but rather want to know how to use this wrapper to perform data mining tasks in .Net.

WekaSharp is available at CodePlex.

Weka is one of the most widely used data mining software packages. It contains over 50 data mining algorithms, has good GUI support, and is well documented. Because of these qualities, many data mining courses use it as illustrative software.

Its GPL license also allows it to be used freely in academia.

However, while the Weka GUI lets us perform data loading, filtering and classifier building interactively, programming is sometimes still needed. For instance, the GUI has poor support for parameter selection or for processing 100 data sets; for such tasks, scripting is a better way. But the Java language is not declarative, and it does not have an interactive shell either.

F# supports both of these well, and data processing in F# is in general far better than in Java. So F# and Weka are a good combination.

Compile Weka library to a .Net dll

As everybody knows, Weka is written in Java. To use it seamlessly within the .Net framework, we need to compile it into a .Net assembly. Fortunately, the IKVM project is made for exactly this purpose.

There is an online tutorial for compiling weka.jar to weka.dll. However, it does not mention how to use external jar libraries with Weka, e.g. libsvm.jar. Because Java's class loading mechanism is different from .Net's, we cannot simply compile different jars into separate dlls. The simplest way is to compile weka.jar and libsvm.jar into a single dll:

>ikvmc -sharedclassloader -target:library weka.jar libsvm.jar

Note that there is a WekaDotNet project on SourceForge. That project was compiled using Visual J#, which has been deprecated in the Visual Studio family. Using IKVM to compile jars to .Net is an easier and more stable way.
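Once compiled, the resulting assembly can be referenced from F# Interactive like any other .Net dll. A minimal sketch (the paths are hypothetical, and the exact IKVM runtime dll names depend on the IKVM version):

```fsharp
// F# Interactive session (hypothetical paths; dll names vary by IKVM version)
#r @"C:\WekaSharp\lib\IKVM.OpenJDK.ClassLibrary.dll"  // IKVM's Java class library
#r @"C:\WekaSharp\lib\weka.dll"                       // produced by the ikvmc command above

// Sanity check: construct a classifier from the compiled assembly
let j48 = new weka.classifiers.trees.J48()
printfn "%s" (j48.getClass().getName())
```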

The overall structure of the wrapper

Based on the Weka class hierarchy, I’ve defined several separate modules for different kinds of functionality:

* Common – contains all common types over the whole wrapper. E.g. the union type for various classifiers.

* Dataset – contains IO routines to read/save various datasets and functions (e.g. split) to manipulate/preprocess datasets.

* Parameter – contains functions to make parameters for the data mining algorithms, all the default parameters are also provided.

* Classify – contains standard classifiers.

* Cluster – contains standard clustering algorithms.

* Eval – contains functions to perform different kinds of classification/clustering tasks and output their results (e.g. accuracy/F1 measure).

In the following sections, I will present each module in detail.

Common module

This module contains all the common types, including classifier types, clustering algorithm types and some evaluation-task types. It is marked as AutoOpen, i.e., the types in the module are easily accessible from the other modules.

module Common =
    let NYI() = failwith "Not yet implemented!"

    type Dataset = core.Instances

    type parameters = string

    type DatafileType =
        Arff | LibSVM | Csv | Svmlight

    type ClassifierType =
        J48 | LogReg | NeuralNet | KNN | NaiveBayes | SVM | LibLinear

    type ClaEvalTask =
        | CrossValidation of int * Dataset * ClassifierType * parameters
        | RandomSplit of float * Dataset * ClassifierType * parameters
        | TrainTest of Dataset * Dataset * ClassifierType * parameters

    type ClustererType =
        KMeans | EM | DBScan

I’ve redefined Weka’s Instances class as Dataset, and use a string for parameters.

Dataset module

This module includes functions to read/save 4 kinds of data files: Weka’s ARFF, libsvm, CSV and SVMLight formats. It also has a randomSplit function to randomly split a dataset with a given ratio.
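The randomSplit function could be sketched as follows (a hypothetical sketch, not necessarily the wrapper's actual implementation): shuffle a copy of the dataset, then cut it at the given ratio.

```fsharp
// Hypothetical sketch of randomSplit: shuffle a copy, then cut at the ratio.
let randomSplit (ratio: float) (ds: Dataset) =
    let copy = new core.Instances(ds)          // copy, so the input stays untouched
    copy.randomize(new java.util.Random(0L))
    let nTrain = int (float (copy.numInstances()) * ratio)
    let train = new core.Instances(copy, 0, nTrain)
    let test  = new core.Instances(copy, nTrain, copy.numInstances() - nTrain)
    train, test
```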

This module is currently far from complete. For instance, we can only load datasets from these 4 kinds of disk files; there is no way to build a dataset from in-memory data, e.g. an array of vectors. We also provide only one data preprocessing step – random split – while many other common preprocessing steps are needed.

I’d like to finish these in another post; in this post, delivering a whole working wrapper (although incomplete) is more important.

Here is some of the implementation:

let readDataFile (ftype: DatafileType) (fname: string) =
    let loader =
        match ftype with
        | Arff -> new converters.ArffLoader() :> AbstractFileLoader
        | LibSVM -> new converters.LibSVMLoader() :> AbstractFileLoader
        | Csv -> new converters.CSVLoader() :> AbstractFileLoader
        | Svmlight -> new converters.SVMLightLoader() :> AbstractFileLoader
    loader.setFile(new java.io.File(fname))
    loader.getDataSet()

Its inputs are a DatafileType and the file name. Based on this function, 4 concrete functions are defined:

let readArff = readDataFile DatafileType.Arff
let readLibsvm = readDataFile DatafileType.LibSVM
let readCsv = readDataFile DatafileType.Csv
let readSvmlight = readDataFile DatafileType.Svmlight

The data saving functions have similar implementation.

Parameter module

This module contains functions that create parameters for the different data mining algorithms. As Weka uses a space-separated string for parameters, it would be unclear to a first-time user what “-R 1.0 -M -1” means for a logistic regression.

I have provided a default parameter string for each algorithm.
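A sketch of how such a parameter builder might look (the function and module names here are hypothetical; only the option letters follow Weka's conventions):

```fsharp
// Hypothetical sketch of a parameter builder for logistic regression.
// In Weka's Logistic classifier, -R sets the ridge value and -M the max
// number of iterations (-1 means iterate until convergence).
module LogReg =
    let makePara (ridge: float) (maxIter: int) =
        sprintf "-R %g -M %d" ridge maxIter

    // the default parameter string, e.g. "-R 1e-08 -M -1"
    let defPara = makePara 1e-8 (-1)
```

Named builders like this let users set parameters by meaningful argument names instead of memorizing Weka's option letters.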

Classify module

In this module, I have wrapped the most commonly used classification algorithms: C4.5 decision tree (named J48 in Weka), Naive Bayes, logistic regression, multilayer perceptron neural net, and Support Vector Machines (SVM).

The getClassifier function accepts a classifier type and an option string, and returns a classifier instance:

let getClassifier (ctype: ClassifierType) (option: string) =
    let classifier =
        match ctype with
        | J48 -> new classifiers.trees.J48() :> classifiers.Classifier
        | NaiveBayes -> new classifiers.bayes.NaiveBayes() :> classifiers.Classifier
        //| KNN -> new classifiers.lazy.IBk() :> classifiers.Classifier
        | LogReg -> new classifiers.functions.Logistic() :> classifiers.Classifier
        | NeuralNet -> new classifiers.functions.MultilayerPerceptron() :> classifiers.Classifier
        | SVM -> new classifiers.functions.LibSVM() :> classifiers.Classifier
        | LibLinear -> new classifiers.functions.LibLINEAR() :> classifiers.Classifier
        | _ -> failwith "not supported"
    classifier.setOptions(core.Utils.splitOptions option)
    classifier

The buildClassifier function takes one more parameter – the dataset – and returns a trained classifier:

let buildClassifier (ctype: ClassifierType) (option: string) (ds: Dataset) =
    checkDataset ds
    let classifier = getClassifier ctype option
    classifier.buildClassifier(ds)
    classifier

Handy functions like getJ48 and getSVM are also defined:

let getJ48 (option:string) = getClassifier ClassifierType.J48 option
let getSVM (option:string) = getClassifier ClassifierType.SVM option

There are several issues:

1. KNN (Weka class: IBk) is not supported yet, as its namespace in Weka is classifiers.lazy, but lazy is a keyword in F# and thus cannot appear directly in a namespace.

2. The IO wrapper for classifiers is not provided yet.

Cluster module

Similar to Classify module, there are getClusterer and buildClusterer in Cluster module. There are also shortcuts to concrete clustering algorithms such as getKmeans and buildKmeans.
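As a hedged sketch (not the wrapper's verbatim code), getClusterer could mirror getClassifier. Each concrete clusterer sets its options before the upcast, since the Clusterer interface itself does not expose setOptions:

```fsharp
// Hypothetical sketch of getClusterer, mirroring getClassifier.
let getClusterer (ctype: ClustererType) (option: string) : clusterers.Clusterer =
    let opts = core.Utils.splitOptions option
    match ctype with
    | KMeans ->
        let c = new clusterers.SimpleKMeans()
        c.setOptions(opts)
        c :> clusterers.Clusterer
    | EM ->
        let c = new clusterers.EM()
        c.setOptions(opts)
        c :> clusterers.Clusterer
    | DBScan ->
        let c = new clusterers.DBScan()
        c.setOptions(opts)
        c :> clusterers.Clusterer
```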

Eval module

This module contains evaluation methods for classification and clustering. For classification, I’ve defined a task discriminated union:

type ClaEvalTask =
    | CrossValidation of int * Dataset * ClassifierType * parameters
    | RandomSplit of float * Dataset * ClassifierType * parameters
    | TrainTest of Dataset * Dataset * ClassifierType * parameters

The classifyEval function performs such a task:

let rec classifyEval (task: ClaEvalTask) =
    let rnd = new java.util.Random(System.DateTime.Now.Ticks)
    match task with
    | TrainTest (dsTrain, dsTest, ctype, para) ->
        Classify.checkDataset dsTrain
        Classify.checkDataset dsTest
        let eval = new classifiers.Evaluation(dsTrain)
        let cl = Classify.buildClassifier ctype para dsTrain
        eval.evaluateModel(cl, dsTest) |> ignore
        eval
    | CrossValidation (cv, ds, ctype, para) ->
        Classify.checkDataset ds
        let eval = new classifiers.Evaluation(ds)
        let cl = Classify.getClassifier ctype para
        eval.crossValidateModel(cl, ds, cv, rnd, Array.empty)
        eval
    | RandomSplit (ratio, ds, ctype, para) ->
        Classify.checkDataset ds
        let train, test = Dataset.randomSplit ratio ds
        classifyEval (TrainTest(train, test, ctype, para))

The result of this function is an Evaluation object. We can use its properties to get various evaluation metrics, or use the shortcut functions in the Eval module:

let getAccuracy (eval:classifiers.Evaluation) = eval.pctCorrect()
let getPrecison (eval:classifiers.Evaluation) (classIdx:int) = eval.precision(classIdx)
let getRecall (eval:classifiers.Evaluation) (classIdx:int) = eval.recall(classIdx)
let getF1 (eval:classifiers.Evaluation) (classIdx:int) = eval.fMeasure(classIdx)
let getAUC (eval:classifiers.Evaluation) (classIdx:int) = eval.areaUnderROC(classIdx)
let getClassifySummary (eval:classifiers.Evaluation) = eval.toSummaryString()


Two features of the wrapper are:

1) A declarative interface to Weka. The following code shows how declarative the wrapper is:

// load the dataset
let dataset =
    @"C:\Program Files\Weka-3.6\data\iris.arff"
    |> Dataset.readArff
    |> Dataset.setClassIndexWithLastAttribute

// describe the cross validation task using SVM as the classifier
let classifyTask = CrossValidation(5, dataset, ClassifierType.SVM, Parameter.SVM.defPara)

// perform the cross validation task and then get the average accuracy
let cvAccuracy =
    classifyTask
    |> Eval.classifyEval
    |> Eval.getAccuracy

The code is understandable even to those who do not know F#.

2) Interactive execution of data mining algorithms.

F# interactive is well integrated in Visual Studio. It is very stable and convenient to use. The powerful VS IDE provides good Intellisense for F#.

Saturday, August 14, 2010

Reading F# Projects, Part III: The F# math providers.


We’ve covered the usage of Math providers in the matrix series. In this post, the whole source code is decomposed.

As I find it time-consuming to convert OneNote notes into a blog format, please see the attached PDF file.




Reading F# Projects, Part II: F# Set

The design pattern in a data structure project like F# Set is simple. However, understanding it really takes time, as data structures involve heavy generic programming, and the type system in F# (or any other static FP language) is very complex.

In this post, we will explore two aspects of F# Set:

1. The general implementation structure for a generic data structure. After learning the pattern, we could write other data structures in F# that are sufficiently generic.

2. Learning something about generics and type inference in F#. We will come to this part in later readings.

In the F# source code, the immutable set implementation (set.fs) is a good starting point for data structures. Why not a more fundamental data structure like F# sequences? Because those are more entangled with the .Net platform, while F# Set is nearly pure F# code. Also because:

I love trees!

F# Set and Map are implemented using balanced binary search trees, to be exact, AVL trees. Balanced BSTs support insert, delete and find operations, all in O(log n) time, so they are widely used for implementing dynamic sets.

In general, trees are the most beautiful data structures and have many variants. They are also efficient; e.g. the R-tree and its relatives are fundamental data structures in every database, and quadtrees are used in computational geometry in theory and spatial databases in practice.

The F# Set is implemented with an immutable AVL tree. Here immutable means that once a node is created, its values cannot be changed. In the CS literature, researchers use “persistent” to describe immutable data structures. If you do 100 insertions into an empty tree, you get 101 versions of the tree, where common nodes are shared to reduce the space cost. It is proved that in such a tree, an insertion still costs O(log n) time, plus O(log n) space for duplicating some nodes (usually along the insertion path).
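The sharing can be seen in a minimal sketch (a plain, unbalanced BST rather than the AVL tree set.fs actually uses): insertion copies only the nodes on the search path and shares every other subtree between the old and new versions.

```fsharp
// Minimal persistent BST sketch (unbalanced, unlike the real AVL-based set.fs).
type Tree<'T when 'T : comparison> =
    | Empty
    | Node of 'T * Tree<'T> * Tree<'T>

let rec insert x t =
    match t with
    | Empty -> Node(x, Empty, Empty)
    | Node(v, l, r) ->
        if x < v then Node(v, insert x l, r)    // copy this node; share r
        elif x > v then Node(v, l, insert x r)  // copy this node; share l
        else t                                  // already present: share everything
```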

As this post is not intended to be a tutorial on data structures, I won’t go into details for the tree implementation. We mainly study how an industry-strength data structure is implemented in F#.

The main structure

set.fs contains about 840 lines of code. The code skeleton is shown below:

namespace Microsoft.FSharp.Collections

type SetTree<'T> when 'T : comparison =
    | SetEmpty                                         // height = 0
    | SetNode of 'T * SetTree<'T> * SetTree<'T> * int  // height = int
    | SetOne of 'T                                     // height = 1

module internal SetTree =
    // functions:
    //   insert, delete, find, etc.
    //   split, rebalance, etc.

    type SetIterator<'T> when 'T : comparison =
        { mutable stack : SetTree<'T> list  // invariant: always collapseLHS result
          mutable started : bool }          // true when MoveNext has been called

type Set<'T when 'T : comparison>(comparer: IComparer<'T>, tree: SetTree<'T>) =
    // member functions:
    //   Add, Remove, Contains, etc.
    //   -, +, Intersection, etc.
    //   GetHashCode, Equals, ToString
    // implemented interfaces:
    //   IComparable
    //   ICollection<'T>
    //   IEnumerable<'T>
    //   IEnumerable

module Set =
    // set module functions:
    //   add, isEmpty, etc.

Under the Microsoft.FSharp.Collections namespace, there are four things: the SetTree<’T> type, the internal SetTree module, the Set<’T> type and the Set module.

SetTree is a type constructor; its parameter is a comparable type ‘T.

Most of the code is in the internal SetTree module: the first part contains the code to manipulate the tree structure, e.g. inserting a node, deleting a node, etc.

The SetIterator<’T> type and related functions are for implementing the IEnumerable interface.

Set<’T>(comparer, tree) is the implementation of Set.

Set module contains the common function to manipulate a set.

In summary, a container like Set usually follows this pattern:

1. An implementation of the underlying data structure, here it is AVL tree.

2. A module for manipulating the data structure; member functions can also do this. Usually we keep functions in both places.

3. An implementation of the IEnumerable and IEnumerable<’T> interfaces, so that the container/data structure can be treated as a sequence.

These are, roughly, the rules for implementing new containers in F#, e.g. F# Map.

Detail 1. The generic

It would be meaningless if the Set implementation only supported integers or floats. We want it to be generic, i.e., to support any type. Since the underlying implementation is based on search trees, the base type must also support comparison.

The other part of genericity, which is more subtle, is the comparer. There are situations where we have different string sets, but the rules to order them are not the same, i.e., the strings may be compared using different rules. So we should take another parameter into consideration. (Refer to functors in OCaml if you want to see a more elegant way to deal with this situation.)

Although F# Set currently does not support different comparers, its internal implementation keeps the comparer in mind. (Because F# supports OOP, the different-comparers problem could be partially solved by inheriting from the original element type; anyway, this is inelegant.)
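To illustrate why the comparer deserves to be a parameter, here is a small sketch using .Net's built-in string comparers (F# Set does not expose this; the point is only that one element type can have several valid orderings):

```fsharp
open System
open System.Collections.Generic

// Two different orderings for the same element type.
let ordinal : IComparer<string> = StringComparer.Ordinal :> IComparer<string>
let caseInsensitive : IComparer<string> = StringComparer.OrdinalIgnoreCase :> IComparer<string>

// Under ordinal, "abc" and "ABC" are distinct; under caseInsensitive they
// compare equal, so a comparer-parameterized set would keep only one of them.
printfn "%d %d" (ordinal.Compare("abc", "ABC")) (caseInsensitive.Compare("abc", "ABC"))
```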

We know that once a Set has some concrete values in it, the type is clear. Here are two ways to initialize a Set:

static let empty : Set<'T> =
    let comparer = LanguagePrimitives.FastGenericComparer<'T>
    new Set<'T>(comparer, SetEmpty)

new (elements : seq<'T>) =
    let comparer = LanguagePrimitives.FastGenericComparer<'T>
    new Set<_>(comparer, SetTree.ofSeq comparer elements)

We can see that both get the comparer for type ‘T and pass it into the constructor of the Set<’T> type.

Detail 2. The comparer

The above code gets the comparer for type ‘T using FastGenericComparer, which lives in the big file prim-types.fs. The primitive types module should be discussed separately and thoroughly; there are quite a few ad hoc tricks in its implementation.

Here we just need to know that it looks up a table to find a comparer, and if the type is one of the primitive types, there are optimizations for it.

Detail 3. From the empty to concrete

F# allows you to write [] and Set.empty. These empties seem to be dynamically typed; actually, no. You cannot write something like:

let empties = Array.create 100 Set.empty

the compiler gives the following error:

error FS0030: Value restriction. The value 'empties' has been inferred to have generic type
    val empties : Set<'_a> [] when '_a : comparison  

This is because the compiler cannot generalize these empties into a single concrete type. It is safe to write

Set.empty + Set([1;2;3])

because the types of the two sets can be unified.

The same situation applies to the zeros of different number types.
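A common workaround (a minimal sketch) is to annotate the element type so the compiler can pick one concrete instantiation:

```fsharp
// Annotating the element type removes the value restriction error.
let empties : Set<int> [] = Array.create 100 Set.empty
printfn "%d" empties.Length
```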

Detail 4. The Iterator

interface IEnumerable<'T> with
    member s.GetEnumerator() = SetTree.mkIEnumerator s.Tree

interface IEnumerable with
    member s.GetEnumerator() = (SetTree.mkIEnumerator s.Tree :> IEnumerator)

Both IEnumerable interfaces are implemented. Notice the difference between the two: the non-generic one needs an upcast. Let’s see the actual implementation:

let mkIterator s = { stack = collapseLHS [s]; started = false }

let mkIEnumerator s =
    let i = ref (mkIterator s)
    { new IEnumerator<_> with
          member x.Current = current !i
      interface IEnumerator with
          member x.Current = box (current !i)
          member x.MoveNext() = moveNext !i
          member x.Reset() = i := mkIterator s
      interface System.IDisposable with
          member x.Dispose() = () }


An exercise is to read map.fs in F# core. The implementation of F# Map is very similar to F# Set.

Reading F# Projects, Part I: The Common Knowledge

During the programming of my data mining library, I occasionally refer to the design of other libraries; e.g. the data mining toolkits WEKA and MALLET are well designed. I also refer to the design of F# libraries. The examples in F# books mainly tell us how to program in the small, i.e. how to manipulate lists or sequences and use certain language features. However, designing in F# is usually covered only in a general way.

Thus, it is beneficial for new F# users to read well-designed F# libraries. In this series, the design of the following source code will be presented (the list is tentative and may change):

* some parts of F# Core and PowerPack.

* F# math providers -- service design pattern and PInvoke in F#.

* TrueSkill – how a specific data mining model is implemented.

* FParsec -- the best example of computation expression/monad.

Before going to any concrete projects, we review the common issues/tips/concepts that may be involved in the process of the project design in F#.

I will add more topics here as we go through different F# programs.

The interface/signature file .fsi

In Java or C#, the implementation and the interface are put in one place. In C/C++, the interface (.h) and the implementation (.c/.cpp) are separated. Either design is fine and has its advantages.

For F#, I think the latter is better. In the .fsi file, all the public functions/types are commented carefully. These comments can be used to generate documentation for the program, or used by the IDE’s IntelliSense.

The actual implementation is in the .fs files. There, comments appear only where actually necessary for the programmer implementing it, e.g. a formula, or a reference to a page number in a book. Because the programmer who writes a function definitely knows what every parameter means, if the interface-style comments were put into the implementation, the programs would look less succinct.

So usually, the programmer writes the implementation in the .fs file. When the implementation is stable or ready to ship, the corresponding .fsi interface file is generated and then carefully commented by hand.

Here is the MSDN page on signature file.

Namespaces and Modules

The latest F# version requires that an F# source file (.fs) start with a namespace, which means all the following code is under this namespace. (An exception: there can be multiple namespaces in a source file.)

Modules are similar to namespaces: you can view a module as a group of functions put into a named scope.

The usage difference between namespaces and modules, in my opinion, is that a namespace is a broader concept. E.g. the Microsoft.FSharp.Collections namespace contains a lot of standard data structures, while the manipulating functions for each data structure are put into a module.

You can also view modules as a class containing only static members.
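A small sketch of this division of labor (the names here are made up for illustration): the type lives in the namespace, and a module of the same name holds its functions, compiling down to a class with only static members:

```fsharp
namespace MyLib.Collections

// The type lives directly in the namespace...
type Pair<'T> = { Fst : 'T; Snd : 'T }

// ...while its manipulating functions live in a module,
// which compiles to a class containing only static members.
module Pair =
    let swap (p: Pair<'T>) = { Fst = p.Snd; Snd = p.Fst }
    let ofTuple (a, b) = { Fst = a; Snd = b }
```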

Classes and Interfaces

F# interfaces are just like interfaces in C#/Java. The interface is a quite universal concept, occurring across different programming paradigms; even in the pure FP language Haskell, type classes are similar to interfaces.

F# classes are different from classes in OO languages. First, F# encourages an immutable programming style for classes: you have one main constructor, and once the object is constructed its values remain unchanged. You can still have mutable member fields in a class. These constraints on F# classes make for safer programs, although they sometimes cause some inconvenience.

Besides classes, F# also has Records, Enums and Discrete Unions.
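A quick illustration of the three non-class type kinds just mentioned (the example types are made up):

```fsharp
// Record: immutable named fields
type Point = { X : float; Y : float }

// Enum: named integer constants
type Color = Red = 0 | Green = 1 | Blue = 2

// Discriminated union: a closed set of cases, optionally carrying data
type Shape =
    | Circle of float          // radius
    | Rect of float * float    // width * height
```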

Extension to existing classes and modules

In F#, you can extend an existing .net class by using “type .. with ..” construct:

type System.Net.WebRequest with
    member this.AsyncGetResponse() : Async<System.Net.WebResponse> =
        async { (* ... the implementation ... *) }

(from the F# source code: control.fs)

In this piece of code, System.Net.WebRequest is an existing class in .Net, and we add a member function AsyncGetResponse to it. This is like lightweight inheritance, saving us from creating a new class.

The attributes

In .Net, attributes associate declarative information with code. They usually occur in production/formal code. Here is a tutorial.

Here I list some commonly used attributes in F#:


To see a full list and the exact meaning of each, no resource is better than the F# Specification, Chapter 16: Special Attributes and Types.