I'll keep updating this article as I go, just to keep everything in one place.

Platforms

Cross-compiling for Linux

I'm pretty sure it's possible to cross-compile for Linux on other OSes, seeing as everything is open-source, but I have never done it - and why would I want to? Linux is the friendliest to build on, so it's better to use it as a build environment.

Cross-compiling for OSX

I've had success using xchain to compile for OSX from Linux, but it's a really long and painful experience.

I won't detail the exact steps here: information that's out there gets outdated almost as soon as it's published, so even if I did put out a step-by-step guide, it would be obsolete in no time.

I remember problems with libstdc++ for C++ compilation. I haven't tried building a cross LLVM-GCC yet.

Compiling for both 32-bit and 64-bit means two separate compilations, unlike building natively on OSX, where you can just use xcodebuild to get fat executables.
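
For what it's worth, the native way to glue two single-architecture builds together is Apple's lipo tool; here's a minimal sketch, assuming you already produced both binaries (the file names are made up):

# Merge a 32-bit and a 64-bit build of the same program into one fat executable
lipo -create pug-i386 pug-x86_64 -output pug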

In summary, I'm not sure it's worth the trouble: it takes a long time to get running, it's fragile, and it involves a lot of hand-holding.

Cross-compiling for Windows

mingw and msys on Windows are basically 'GCC on Windows'. They work fine most of the time and even include tools like bash and some Unix utilities (even vim!), but let me state this once and for all: mingw on Windows is slow as fuck.

A much better solution, in fact, is to use mingw on Linux (or another OS) to produce Windows executables. Apparently, mingw-w64 is a more up-to-date version (fork?) that also supports building 64-bit Windows executables.

Debian testing and later ship a mingw-w64 package that'll get you started.
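
For instance, something along these lines should be enough for a first hello-world (exact package names may differ between Debian releases; hello.c is your own test program):

sudo apt-get install mingw-w64
# 32-bit Windows executable
i686-w64-mingw32-gcc -o hello32.exe hello.c
# 64-bit Windows executable
x86_64-w64-mingw32-gcc -o hello64.exe hello.c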

Cross-compilation means the build platform (the machine doing the compiling, here Linux) differs from the host platform (the machine the resulting executables will run on, here Windows).

In the next sections, I'll just assume we're cross-compiling from Linux to Windows using mingw-w64.

Autotools

Autotools philosophy

Autotools' principle is that the package maintainer runs the autotools (autoconf, automake and friends) to generate a portable configure script, so that users only need a shell and make to build the package.

A few notes:

Out-of-tree builds are supported: create a separate build directory and run the configure script from there, which keeps the source tree clean. The examples below do exactly that.

I haven't yet found a case where ./autogen.sh or ./auto_gen.sh is actually needed - if you have recent autotools installed, it's fine.
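
For reference, a plain native build (no cross-compiling yet, the prefix is just an example) is the classic three-step dance:

./configure --prefix=$HOME/prefix
make
make install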

Cross-compiling for Windows with Autotools

When running configure, use the --host=HOST option. There's also --build and others, but I'm confused about those and never had to use them except when cross-compiling GCC for another platform. There's a reasonably good explanation in the automake docs.

Example HOST values: i686-w64-mingw32 (32-bit Windows), x86_64-w64-mingw32 (64-bit Windows), or i686-apple-darwin11 (OSX, see below).

It's a good idea to cross-compile into a separate prefix so you don't mix native and foreign libraries. So most of the time, the build process looks something like this:

mkdir build
cd build
../yourpackage/configure --host=i686-w64-mingw32 --prefix=$HOME/i686-w64-mingw32-prefix
make
make install
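
If a later package depends on something you installed into that prefix, configure also needs to be told where to look; one way (otherpackage and the paths are hypothetical, adjust to your setup) is to pass the usual flags:

../otherpackage/configure --host=i686-w64-mingw32 \
  --prefix=$HOME/i686-w64-mingw32-prefix \
  CPPFLAGS=-I$HOME/i686-w64-mingw32-prefix/include \
  LDFLAGS=-L$HOME/i686-w64-mingw32-prefix/lib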

Side-note about temporary programs

Some packages compile executables that are run during the build process - such as tests, or libjit's opcode generator.

Sometimes the package is set up so that these are built for the build architecture (Linux) instead of the host architecture (Windows), which is perfect.

But if it isn't, chances are you'll get an error about some missing executable, and you'll find an .exe file in its place.

And if you're really unlucky, the package's Makefile.am files will invoke util directly instead of util$(EXEEXT) - you might need to fix that too.
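
When that happens, the fix is usually tiny; here's a sketch with made-up names (opcode-gen stands in for whatever temporary tool the package builds):

# In the offending Makefile.am, turn direct invocations like:
#     ./opcode-gen > opcodes.h
# into:
#     ./opcode-gen$(EXEEXT) > opcodes.h
# then regenerate the build system:
autoreconf -i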

To be able to run .exe files directly from Linux, on Debian you can do the following:

sudo apt-get install wine binfmt-support
sudo /usr/sbin/update-binfmts --package wine --install wine /usr/bin/wine --magic MZ
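
Once that's done, .exe files can be executed as if they were native; for example, with the hypothetical hello-world from earlier:

./hello32.exe   # wine picks it up transparently, no 'wine' prefix needed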

If you're lucky, the temporary programs will run just fine and everything will work out. If they don't, it's time to read the autotools docs to learn how to build those tools for the build platform instead of the host platform.

If running these with wine messes up your terminal (when pressing enter you see '^M'), running reset should fix it in no time.

Cross-compiling for OSX with Autotools

Same principle: use --host=i686-apple-darwin11, or whatever matches your cross toolchain.
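
So the earlier Windows example becomes, assuming an xchain-style toolchain using the i686-apple-darwin11 prefix:

mkdir build
cd build
../yourpackage/configure --host=i686-apple-darwin11 --prefix=$HOME/i686-apple-darwin11-prefix
make
make install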

CMake

CMake philosophy

CMake is basically a build file generator: from one project description, it can generate Unix Makefiles, Ninja files, Visual Studio solutions, Xcode projects, and more.

CMake usage

The cmake command takes as argument the folder of a CMake project, i.e. where the CMakeLists.txt file lives.

That means out-of-tree builds are possible as well. To specify variables like the prefix, use the -D option. Example run:

mkdir build
cd build
cmake ../yourpackage -DCMAKE_INSTALL_PREFIX=$HOME/prefix
make
make install

In fact, out-of-tree builds are the recommended way to go for CMake projects in general. Sometimes build folders are created inside the source distribution, but if it's just a dependency and not your own project, that's a bad habit in my opinion.

Cross-compiling for Windows with CMake

As far as I can tell, CMake has no built-in cross-compiling support like autotools does. However, it does support toolchain files. Here's what a toolchain file for mingw-w64 might look like:

# TARGET_ARCH (e.g. i686-w64-mingw32) and MINGW_INSTALL_PREFIX are expected
# to be defined before this file is read, e.g. passed with -D on the command line.
SET(CMAKE_SYSTEM_NAME Windows)

# Use the cross toolchain's compilers and resource compiler
SET(CMAKE_C_COMPILER ${TARGET_ARCH}-gcc)
SET(CMAKE_CXX_COMPILER ${TARGET_ARCH}-g++)
SET(CMAKE_RC_COMPILER_INIT ${TARGET_ARCH}-windres)

# Qt tools, in case the project needs them
SET(QT_MOC_EXECUTABLE ${MINGW_INSTALL_PREFIX}/bin/moc)
SET(QT_RCC_EXECUTABLE ${MINGW_INSTALL_PREFIX}/bin/rcc)
SET(QT_UIC_EXECUTABLE ${MINGW_INSTALL_PREFIX}/bin/uic)

SET(CMAKE_SYSTEM_PROGRAM_PATH ${CMAKE_INSTALL_PREFIX}/bin)
SET(CMAKE_FIND_ROOT_PATH ${MINGW_INSTALL_PREFIX})
SET(CMAKE_INSTALL_PREFIX ${MINGW_INSTALL_PREFIX})

# Programs are found on the build machine; libraries and headers
# are only searched for in the cross prefix
SET(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
SET(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
SET(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
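
Since the file above expects TARGET_ARCH and MINGW_INSTALL_PREFIX to be provided, a full invocation could look like this (the toolchain file path is just an example):

mkdir build
cd build
cmake ../yourpackage \
  -DCMAKE_TOOLCHAIN_FILE=$HOME/toolchains/mingw-w64.cmake \
  -DTARGET_ARCH=i686-w64-mingw32 \
  -DMINGW_INSTALL_PREFIX=$HOME/i686-w64-mingw32-prefix
make
make install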

Cross-compiling for OSX with CMake

Same principle: you can find toolchain files for OSX on the internet.

Runtime dependencies

Find dynamically linked libraries on Linux

To know which dynamic libraries a Linux executable is linked against, one can simply run ldd EXEC:

$ ldd ~/Dev/ooc/jit-tests/libjit1
        linux-vdso.so.1 (0x00007fff8c3fe000)
        libjit.so.0 => not found
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f9d5468a000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f9d54a5a000)

ldd is nice, because it respects LD_LIBRARY_PATH and works recursively - as long as it can find all the dependencies:

$ LD_LIBRARY_PATH=~/Dev/ooc/jit-tests/prefix/lib ldd ~/Dev/ooc/jit-tests/libjit1
        linux-vdso.so.1 (0x00007fff2fba2000)
        libjit.so.0 => ~/Dev/ooc/jit-tests/prefix/lib/libjit.so.0 (0x00007f99eaa1b000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f99ea64d000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f99ea42f000)
        libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f99ea22b000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f99e9f2a000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f99eacac000)

(Here, libjit was in a local prefix, and libm/libdl were dependencies of libjit, not the executable itself).

Find dynamically linked libraries on OSX

On OSX, use otool -L:

$ otool -L bin/pug
bin/pug:
  ~/Dev/pug/pug/deps/prefix/lib/libcollections.0.dylib (compatibility version 1.0.0, current version 1.0.0)
  /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1197.1.1)

A few notes about otool: unlike ldd, it only lists the direct dependencies of the binary you give it (it doesn't recurse), and the paths it prints are the libraries' install names, which can be rewritten with install_name_tool.

Find dynamically linked libraries on Windows

On Windows, one can use the Dependency Walker to figure out which DLLs are loaded - it'll even go so far as to monitor the executable while it's running, to catch DLLs that aren't linked against but are still loaded dynamically (with LoadLibrary).

On Linux, you can run i686-w64-mingw32-objdump -p executable.exe - the output is a bit verbose, so you might want to pipe it into grep 'DLL Name':

$ i686-w64-mingw32-objdump -p bin/pug.exe | grep 'DLL Name'
        DLL Name: KERNEL32.dll
        DLL Name: msvcrt.dll
        DLL Name: USER32.dll

And that's it.