Home
Boost is a collection of free, peer-reviewed C++ libraries. We emphasize libraries that work well with the C++ Standard Library. Boost libraries are intended to be widely useful, and usable across a broad spectrum of applications. The Boost license encourages both commercial and non-commercial use.
uBLAS is a C++ template class library that provides BLAS level 1, 2, 3 functionality for dense, packed and sparse matrices. The design and implementation unify mathematical notation via operator overloading and efficient code generation via expression templates.
The project description is simple: add support for multicore parallelism and GPU computation to uBLAS! The realization is not straightforward, though. Boost.uBLAS is CPU only. If the compiler is able to vectorize, uBLAS can benefit from it. Here we want to extend Boost.uBLAS to support parallel architectures and GPU computation, enabling it to handle big data and deep learning workloads.
The student will first have to understand how uBLAS works and how it generates and optimizes code through the expression template mechanism, and then start adding options to enable the use of Boost.Compute. Testing will be done on multicore systems and on graphics cards or computers that support Boost.Compute (through OpenCL, for example).
We expect the basic matrix operations to be implemented this way. The code will have to be thoroughly documented and a tutorial document provided. We prefer quality of implementation to exhaustiveness.
Here's a link to the accepted proposal (it also contains the milestones).
To sum it up, the proposal was to add functions that perform matrix operations on any device that supports OpenCL, plus a mechanism that lets the user keep matrices on the device for further computations there. During the project I finished that and extended the scope to include vectors, their operations, and matrix-vector operations.
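To make the idea concrete, a sketch of what using these additions could look like follows. This is an illustration only: the names `opencl::library`, `opencl::prod`, and the queue-based overload are assumptions based on the description above, not a guaranteed reflection of the exact API; running it requires Boost and an OpenCL-capable device.

```cpp
// Hypothetical usage sketch; opencl::library, opencl::prod and the
// queue-based overload are assumed names, not the verified API.
#include <boost/numeric/ublas/matrix.hpp>
#include <boost/numeric/ublas/opencl.hpp>
#include <boost/compute/core.hpp>

namespace ublas = boost::numeric::ublas;
namespace compute = boost::compute;

int main()
{
    // Set up (and, on destruction, tear down) the OpenCL backend.
    ublas::opencl::library lib;

    // Pick a device and build a context and command queue for it.
    compute::device device = compute::system::default_device();
    compute::context context(device);
    compute::command_queue queue(context, device);

    ublas::matrix<float> a(64, 64, 1.0f);
    ublas::matrix<float> b(64, 64, 2.0f);

    // Run the multiplication on the device: the operands are copied to
    // the device, multiplied there, and the result copied back.
    ublas::matrix<float> c = ublas::opencl::prod(a, b, queue);
}
```

The queue argument is what selects the device; the proposal's "keep matrices on the device" mechanism would avoid the copies around each call when several operations are chained.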
This documentation describes in detail:
- Why should I use the project additions?
- How to use them and how to set up your machine?
- Classes
- Supported operations
- Examples
This is the branch I was working on
All commits I have done can be found here:
- added opencl/opencl_core.cpp: contains the classes and the API for setting up the library
- added opencl/operations.cpp: contains the implementation of all the supported operations
- added all the files in test/opencl/: these test all the supported operations with all the data types those operations accept. Note: they are run on each commit via continuous integration (Travis CI and AppVeyor).
- added benchmarks for all the uBLAS operations (both the CPU and the OpenCL ones) in the benchmarks/ folder (files directly in that folder) and in benchmarks/opencl/. Note: the new benchmark files produce output that is plotted into comparison graphs by benchmarks/plot.py (for more information refer to the documentation above).
- edited test/Jamfile and benchmarks/Jamfile.v2 to enable running the benchmarks and tests automatically with the rest of the library
I want to thank my mentor Stefan Seefeld, who supported me with all the guidance and information needed while designing and implementing the project. I also want to thank Google for giving me the opportunity to contribute to such a big project and grow my experience through it.