NADIA Software Project Page

Quick Links

All questions and comments should be directed to the mailing list, cxs-software@physics.unimelb.edu.au

Software Manual - Including Installation Instructions (original) (pdf) (html). Note that this document is still being written and is currently incomplete.

Package source code tar-ball - Release 0.6 (this is a test release with basic functionality and includes IDL routines). Tested on Ubuntu, osiris. See the Changelog for a description of what's new.

32-bit binary files for Windows - Release 0.5 (this is a test release with basic functionality and includes IDL routines). Tested on: Windows XP, Windows 7.

Workbook and examples code tar-ball - This is a workbook with example templates. Tested on Ubuntu, osiris.

Some preliminary specifications (written by Brian) - please give feedback on them!

User questionnaire

Package Description

Doxygen documentation of the source code (for c/c++ users)

IDL documentation of the available routines

Presentations

To Do List

Useful Tools

Package Description

Goals

CXS has developed a number of important X-ray analysis algorithms that it seeks to make available to other members of CXS and to the wider science community. The aim of this project is to provide a standard software package for these algorithms and for basic X-ray diffraction data analysis. In particular, it should allow new members of CXS to be introduced to the image reconstruction algorithms in a user-friendly way. To achieve this, the software must be well documented, efficient, robust and work on a number of different platforms.

Current Functionality

For a list of the proposed final functionality, please look at the
specifications. Currently, the following algorithms/functions have been implemented:

Technical Details

The package has been written in C++ and is structured as:

If time allows and there is a need for it, a simple GUI may also be constructed.

A comprehensive description of the code is given in the doxygen documentation. This should be particularly useful to users who are familiar with C++ and want to use the library option, as it lists the various functions available.

How to get Started

You should find all the information you need to start working with the software in the "How to..." documentation.

To Do List

Useful Tools

HDF File Navigator

This is a useful tool for quickly seeing the structure of an HDF file and viewing the images it contains. To install, go to
http://www.hdfgroup.org/hdf-java-html/hdfview/ and follow the instructions for your operating system. Opening and looking at a file should be straightforward. To view the data as an image rather than a table of numbers, just right-click on the data and select "view as image.." (I need to check the exact names).
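As an aside to the HDFView route above: if the file happens to be an HDF5 file and the standard HDF5 command-line tools are installed, its structure can also be inspected from the terminal (the file name here is just a placeholder):

h5ls -r my_data.h5      # recursively list the groups and datasets in the file
h5dump -H my_data.h5    # print the structure and attributes without dumping the data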

A script for automatically redirecting a job's output to a subdirectory

When you run the examples from the software package, you'll find that image files and other output are written to the directory you're working in. If you want to run a reconstruction several times (for example, with a different set-up), you will probably want the output to go automatically into different subdirectories.

I've written a really simple bash script which puts a job's output (images and standard output) into a subdirectory (with the date and time as the folder name). It will also copy the .c file to the directory so you have a record of what was run. It is downloadable from here; a rough sketch of what the script does is shown after the instructions below.

Some instructions:

Put the script into the directory where your code is and type:
chmod +x dir_maker.sh
You only need to do this once.

Then each time you want to run your code type
./dir_maker.sh my_code_name

Note that you need to give the script your executable's file name minus the .exe part at the end. You can adjust the script a bit to make the directory name however you like.

Also, you will need to change all the input image file names from "image_files/something.tiff" to "../image_files/something.tiff", because the program is now run from inside the subdirectory.
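For reference, here is a minimal sketch of what such a script might look like. It only illustrates the behaviour described above (time-stamped directory, copy of the .c file, redirected output) and is not necessarily identical to the script linked above:

#!/bin/bash
# dir_maker.sh - run an executable and collect its output in a time-stamped subdirectory.
# Usage: ./dir_maker.sh my_code_name   (the executable is assumed to be my_code_name.exe)

if [ $# -ne 1 ]; then
    echo "Usage: $0 <executable name without .exe>"
    exit 1
fi

NAME=$1
DIR=$(date +%Y-%m-%d_%H-%M-%S)    # date and time as the folder name

mkdir "$DIR"
cp "$NAME.c" "$DIR"               # keep a record of which source file was run
cd "$DIR"

# Run the executable from inside the subdirectory so that its image files land here,
# and keep a copy of the standard output as well.
"../$NAME.exe" | tee output.log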

A very useful tool for checking code performance

The Google profiler is a great tool for profiling the performance of code. I highly recommend it for your own code! I ran it on the real_example.exe program to check the efficiency of the code so far. Currently about 50% of the time is spent executing the FFTW libraries (up from about 12% when I first ran it, so the non-FFT parts of the code have sped up considerably!). There is still room for improvement.
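For anyone who wants to try it, the steps below are a rough sketch assuming the profiler in question is the gperftools (Google perftools) CPU profiler; the compile line is only illustrative and omits whatever other libraries the program needs.

# Link the program against the profiler library (other flags and libraries omitted).
g++ -o real_example.exe real_example.c -lprofiler

# Run with profiling enabled; CPUPROFILE names the file the profile is written to.
CPUPROFILE=real_example.prof ./real_example.exe

# Print a text summary of where the time was spent
# (the tool may be installed as google-pprof on some systems).
pprof --text ./real_example.exe real_example.prof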