Tuning your plots

I showed a student how I would improve a plot. The result is, I think, typical for what I would design. The details change for each paper and there is no specific “style” I wanted to emulate. I usually enjoy the before & after comparisons of other people's graphics, so here is mine:

PS: The data shown in the plots are preliminary

Thesis Art: Maria Sokotushchenko

I was a supervisor for Maria Sokotushchenko’s Master’s Thesis.

In her thesis-art, I artistically visualized the brain’s surprise response to an unexpected stimulus change. The responses are sorted by how fast subjects reacted (slow on top, fast at the bottom).

 

The idea of “thesis art” is to inspire discussion with people who do not have an academic background or who work in a different field. The thesis is hidden in a drawer, but the poster is out on the wall for everyone to see. You can find all past thesis-art pieces here

Thesis Art: Edoardo Pinzuti

I was a co-supervisor for Edoardo Pinzuti’s Master’s Thesis. I finally got around to making this artwork from the text of his thesis.

He wrote an impressive Matlab toolbox to analyze the direction of causality in time series based on Takens’ theorem. The whole idea is about reconstructing embeddings of chaotic systems, with the Lorenz system (the one depicted in this artwork) being a simulation example in his thesis. You can find the DDIFTOOL toolbox here

The idea of “thesis art” is to inspire discussion with people who do not have an academic background or who work in a different field. The thesis is hidden in a drawer, but the poster is out on the wall for everyone to see. You can find all past thesis-art pieces here

Thesis Art: Judith Schepers

I was a supervisor for Judith Schepers’ Bachelor’s Thesis.


In this thesis-art, I visualized the guided-bubble paradigm used in a recent publication in the Journal of Vision. Judith generalized the paradigm to more than five bubbles, which is why many more bubbles are visible in the thesis-art.

The idea of “thesis art” is to inspire discussion with people who do not have an academic background or who work in a different field. The thesis is hidden in a drawer, but the poster is out on the wall for everyone to see. You can find all past thesis-art pieces here

New Paper: The temporal dynamics of eye movements as an exploitation-exploration dilemma.

We just published a new paper in the Journal of Vision

The temporal dynamics of eye movements as an exploitation-exploration dilemma
Ehinger, Kaufhold & König, 2018

The highlights:

  • We frame eye movements as a decision process between exploiting the current view and exploring more of the scene
  • We use gaze-contingent eye-tracking to control the When and Where of eye movements
  • We find large effects of fixation duration on the reaction time to continue exploring
  • We find large effects of the number of possible future target locations (Hick’s effect)

Check out the paper at the Journal of Vision
doi:10.1167/18.3.6

Ubuntu 16+: Recover ctrl+alt+bksp to restart X server

When developing with Psychtoolbox or PsychoPy/OpenSesame, your program will occasionally crash and leave a full-screen window open that you cannot click away. I then try to Alt+Tab and type “sca” (screen close all) into the Matlab console, with mixed success. Sometimes restarting the computer seems to be the last option. Instead of restarting, a useful shortcut in older Ubuntu versions was: Ctrl + Alt + Backspace => restart the X server (=> restart the GUI).

In order to activate this again use:

setxkbmap -option terminate:ctrl_alt_bksp

source on askubuntu
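Somewhat related, for the Psychtoolbox case described above: a minimal sketch (assuming Psychtoolbox is on the path; myExperiment is a placeholder name) of wrapping the experiment in try/catch, so the full-screen window closes even when the script errors:

try
    myExperiment();   % placeholder for the actual experiment script
catch err
    sca;              % "screen close all": closes all Psychtoolbox windows
    ShowCursor;       % bring the mouse cursor back
    rethrow(err);     % still show the original error in the console
end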

 

 

PS: I wrote this blogpost because I looked up this thing multiple times – now I know where to look 😉

Stretching the axes; visualizing non-linear linear regression

 

From time to time I explain concepts and ideas to my students.

Background

Often this pops up in a statistical context, when there is a non-linear dependency between the to-be-predicted variable and the predictor variables. By transforming the predictors, relationships can be made linear, i.e. logarithmic (exponential, quadratic, etc.) relationships can be modeled by a **linear** model.

The idea

I have a very visual understanding of basis functions / non-linear transformations of variables in terms of stretching / condensing the basis (the x-axis here). This can also be applied to the generalized linear model (here: logistic regression).

Imagine that the x-axis of a plot is made of some kind of elastic material that you can stretch and condense. Of course you do not need to stretch every part equally; one example would be to stretch parts that are far away from zero exponentially more than parts that are close to zero. If you had an exponential relationship ($y = e^x$), $y$ would now lie on a straight line, because on the new axis $x' = e^x$ we simply have $y = x'$.

TLDR;

If you have a non-linear relationship, stretching the x-axis in accordance with that non-linearity turns it into a linear relationship.

An exemplary non-linear relationship:

We want to fit $y = b_0 + b_1x$, but obviously a straight line does not fit well. We can do something called polynomial expansion, i.e. add more predictors which are simple transformations of the predictor $x$: $y = b_0 + b_1x + b_2x^2 + b_3x^3$.

Here comes the trick: we can interpret the new $x^3$ basis function as a stretching of the x-axis, i.e. the further we move out on the x-axis, the more we need to stretch the parts (exactly by $x^3$).
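To make this concrete, here is a minimal sketch in Matlab (toy data; variable names are made up for illustration). The polynomial expansion is nothing more than adding columns to the design matrix of an ordinary linear model:

x = linspace(-2,2,100)';                 % predictor
y = 0.5 + 1.5*x.^3 + 0.5*randn(100,1);   % toy data with a cubic relationship

X    = [ones(size(x)) x x.^2 x.^3];      % design matrix with polynomial basis
b    = X \ y;                            % least-squares fit: still *linear* in b
yhat = X * b;                            % fitted values

% the "stretched axis" view: plotted against x.^3, the data lie
% (approximately) on a straight line
plot(x.^3, y, 'o', x.^3, yhat, '-')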

This can also be shown for other functions:

Exponential

Logarithmic

Note that the logarithm is not defined for negative numbers

Quadratic

Note how the stretching can be negative, i.e. the original negative values are stretched/transformed to positive values.

Using the trick on the y-axis

One can interpret **logistic regression** with the same trick:
$$ g^{-1}(y) = b_0 + b_1 x \Leftrightarrow y = g(b_0 + b_1 x)$$
with $g$ the logistic function (invlogit) and $g^{-1}$ its inverse, the logit link function:
$$ g^{-1}(p) = \ln\frac{p}{1-p} \quad\Leftrightarrow\quad g(x) = \frac{1}{1+e^{-x}}$$

Usually we would have some non-linear dependency on a probability of, e.g., success. That means that with a low value of $x$, your chance of success is low. To model this kind of data, one can transform the y-axis using $g$ above.
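The same idea in Matlab, as a hedged sketch (this assumes the Statistics Toolbox; glmfit/glmval apply the logit link and its inverse for you, and the data are made up):

x = linspace(-3,3,200)';                          % predictor
p = 1 ./ (1 + exp(-(0.5 + 2*x)));                 % true success probability (toy example)
y = double(rand(200,1) < p);                      % binary outcomes (0/1)

b    = glmfit(x, y, 'binomial', 'link', 'logit'); % fits g^{-1}(p) = b_0 + b_1*x
phat = glmval(b, x, 'logit');                     % predictions back on the probability scale

plot(x, y, 'o', x, phat, '-')                     % data and fitted logistic curve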

Working remote – X11 Forward, Putty, Windows, Gateway

Sometimes I need Matlab/RStudio/Spyder with access to the university network. One way is to run Matlab/RStudio/Spyder on a university computer but get the X display (= graphics) on my local Windows machine.

Because there is a gateway in between, I first need to tunnel through the gateway to a university computer, then use a second Putty session to ssh through that tunnel directly to the target machine.

These are the steps I need to do:

– Putty: ssh to gateway.university:22; go to SSH → Tunnels and set source port: 2222 (this is the local port the second session will connect to) and destination: remote-pc-that-runs-matlab:22

– Putty again: ssh to localhost:2222 with X11 forwarding enabled and “Xming” installed

 

The result is working (but sometimes slow) remote X11 forwarding. In the future I want to check out rdb to remotely control the whole session. This could be quite useful in many cases because my programs are usually running anyway 🙂

[matlab] performance for-loops vs. vectorization vs. bsxfun

From time to time I explain certain concepts to my students. To archive those explanations and as an extended memory, I share them here. We also recently had some discussion on vectorization in our research group, e.g. in Python and Matlab, with the second link claiming that for-loops in Matlab perform much better than they used to.

 

Goal

Show that for-loops are still quite slow in Matlab, and compare bsxfun against vectorized arithmetic expansion.

The contenders

  • good old for-loop: easy to understand, can be found everywhere, slow
  • arithmetic expansion: medium difficulty, should generally be used, fast
  • bsxfun: somewhat difficult to understand, I use it regularly, fast (often)
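For concreteness, here is the same row-subtraction written in all three styles (a small sketch; the full benchmark code is further down):

data = rand(1000,100);

% good old for-loop over columns
x1 = data;
for k = 1:size(data,2)
    x1(:,k) = data(:,k) - data(1,k);
end

% arithmetic expansion (implicit expansion, available from Matlab R2016b on)
x2 = data - data(1,:);

% bsxfun, works in older versions too
x3 = bsxfun(@minus, data, data(1,:));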

Comparisons

While demonstrating this to my student, I noticed that subsetting an array has interesting effects on the performance differences. The same is true for different array sizes. Therefore, I decided to systematically compare those.

I subtract one row from either a subset (first 50 rows, dashed line) or all rows of an [n x m] matrix with n= [100, 1000, 10 000] and m = [10, 100, 1000, 10 000]. Mean + SE

Three take home messages:

  • the for-loop is very slow
  • vectorization is fastest for a small first dimension, otherwise as fast as bsxfun
  • bsxfun is fastest if one needs to subset a medium-sized array (n x m > 100 x 1000), but see the update below!

 

Update:

Prompted by Anne Urai, I redid the analysis with multiplication & division. The pattern is the same. I did notice that allocating new matrices before doing the arithmetic expansion (vectorization) results in the same behaviour as bsxfun (but needs more lines of code):

A = data(ix,:);   % copy the subset into a new matrix first
B = data(1,:);    % same for the row we divide by
x = A./B;         % then do the arithmetic expansion on the copies

 

matlab code

% Benchmark: subtract one row from either a subset (first 50 rows) or all rows
% of a [dim1 x dim2] matrix, using a for-loop, arithmetic expansion and bsxfun.
% Timings are collected in the table tAll.
tAll = [];
for dim1 = [100 1000 10000]
    for dim2 = [10 100 1000 10000]
        tStart = tic;
        for subset = [0 1]
            if subset
                ix = 1:50;
            else
                ix = 1:dim1;
            end
            for run = 1:10
                data = rand(dim1,dim2);
                
                % for-loop
                x = data;
                tic
                for k= 1:size(data,2)
                    x(ix,k) = data(ix,k)-data(1,k);
                end
                t = toc;
                tAll = [tAll; table(dim1,dim2,subset,{'for-loop'},t)];
                %vectorized
                tic
                x = data(ix,:)-data(1,:);
                t = toc;
                tAll = [tAll; table(dim1,dim2,subset,{'vectorization'},t)];
                % bsxfun
                
                tic
                x= bsxfun(@minus,data(ix,:),data(1,:));
                t = toc;
                tAll = [tAll; table(dim1,dim2,subset,{'bsxfun'},t)];  
            end
        end
        fprintf('finished dim1=%i,dim2=%i - took me %.2fs\n',dim1,dim2,toc(tStart))
    end
end

% Plotting using the awesome GRAMM-toolbox
% https://github.com/piermorel/gramm
figure
g = gramm('x',log10(tAll.dim2),'y',log10(tAll.t),'color',tAll.Var4,'linestyle',tAll.subset);
g.facet_grid([],tAll.dim1)
g.stat_summary()
g.set_names('x','log10(second dimension [n x *M*])','y','log10(time) [log10(s)]','column','first dimension [ *N* x m]','linestyle','subset 1:50?')
g.draw()

 

Scientific Poster Templates

I was asked about the design of my academic posters. Indeed, I have templates in landscape and portrait format and I’m happy to share them. In addition, I can recommend the blog better-posters, which regularly has features and link-roundups on poster design.

In my newest poster (landscape, below) I tried to move as much text as possible to the side, so that people can still understand the poster but the text does not obscure the content. I also really like the 15s summary: an easy way to see whether you will like the poster or whether you can simply move on. Maybe it even needs to be a 5s summary!

These are two example posters based on my template.

Neat Features

Titles’ backgrounds follow along
This is useful because you do not need to manually resize the white background of the text that overlays the box borders.

Borders are effects, easy resizing
The corners are based on Illustrator effects, thus resizing the containers does not change the curvature. Before, I often had very strange curvatures in my boxes. No more!

 

Download here

Portrait Equal Columns (ai-template, 0.3mb)

Portrait Unequal Columns (ai-template, 0.3mb)

Landscape (ai-template, 0.4mb)

Licence is CC-4.0; you can acknowledge me if you want, but no need if you don’t 🙂
