The long void

January 16, 2016

It has been a long time since my last post. I am approaching the keyboard again, putting one word after the other, to get back into the habit of writing.

During this blog's long void, I kept busy studying some new research topics and expanding my toolbox. My intention is to share some of these new areas of interest in future posts.

Here is a short list of them, with some comments and useful links for anyone interested in having a look.

In physics:

  • Helicon plasmas: I am still working on this kind of plasma source, which uses a helicon antenna and magnetic coils. It has the advantage of creating dense, homogeneous plasmas in a compact volume. The problem is that the exact mechanism by which the electromagnetic wave ionizes the gas is not understood. The masters of this technique are Chen and Boswell. One well-known application is the generation of the plasma in the VASIMR engine.
  • Nuclear Magnetic Resonance in plasmas: this is still a very raw idea which needs a lot of work before we have even a proof of concept. NMR is classically used in fluids and solids, but not in plasmas, where the density, several orders of magnitude lower, prevents measuring a clear signal. One idea would be to use a fully polarized gas (whereas only a small percentage of the ions in the human body, for instance, is ever polarized). More details on the existing ideas are in the paper by.
  • Techniques to measure RF and DC electric fields in the sheath of a plasma in the presence of a magnetic field: this is the topic given to one of my PhD students. I will not go today into the intricacies of why we want to measure them, but the main method involves the Stark effect and Stark mixing (a quick reminder of the underlying formula follows this list). So back to the classics of quantum mechanics.
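
As a side note on the Stark method mentioned above: for hydrogen-like levels (a textbook result, nothing specific to our setup), the linear Stark shift of a state with principal quantum number n and parabolic quantum numbers n_1, n_2 in an electric field F is

\Delta E = \tfrac{3}{2}\, n\, (n_1 - n_2)\, e\, a_0\, F,

where e is the elementary charge and a_0 the Bohr radius. The shift, and the mixing of nearly degenerate sublevels, is therefore directly proportional to F, which is what turns a spectroscopic line measurement into a measurement of the local electric field.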

As for the toolbox, I have embraced two types of tools that I will have great difficulty abandoning now:

  • The Jupyter (formerly IPython) infrastructure: I say infrastructure and not notebook because it is more than that: the whole underlying machinery is really powerful, well thought out, and has not yet shown the full measure of its capabilities. I mainly use a self-made version of JupyterHub (Windows compatible, with its own file manager) to give my team access, through the browser, to the notebooks and the Python kernels. The big advantage is that access to the experimental database and to all the associated processing libraries goes through the notebook. Display of the data is also done in the notebook, through JavaScript extensions. So the end user does not have to install any software or manage library compatibility: everything is handled in one place, on the server side. In addition, I can change the structure of the database as often as I want: as long as I keep the same access interface, nothing changes for the user (see the small sketch after this list). The other big advantage is the (relatively) smooth transition from data analysis to publishing: you can do almost everything in the same place, and you keep the traceability of the data plotted in your articles. Since the whole team has access to the notebooks, they can clone them, comment on them and improve them.
  • Open hardware: in spite of all the articles on the topic and the success of the Raspberry Pi, the Arduino and the like, we still underestimate the potential of cheap, fully documented hardware, and especially its potential for science. Because of the way science is funded, experiments increase in size and decrease in number, at the expense of small and medium-sized experiments. Open hardware clearly makes it possible to achieve operations that were until now reserved for expensive hardware supported by expensive software. I have played with the Redpitaya, which is basically a Zynq 7000 SoC with radio-frequency ADCs and DACs, and what we can do with it is really incredible. Not only can you acquire or emit signals up to 150 MHz, but you also have a small FPGA to implement your real-time operations. I would advise you to have a look at Pavel Demin's Git repositories to see everything that can be done with it. If, in addition, you integrate this kind of board into your own internet of things, you greatly simplify your flow of data: the board can be controlled directly from Jupyter (well, notebook, dashboard, control board; you start to see why this gets interesting), and you access all your tools from one location, reachable over the internet from anywhere (thanks to your tablet or your phone). Imagine being able to run your experiments from a beach on the other side of the planet (in practice it is not authorized by the safety authorities, and morally it is not accepted by your boss). A rough sketch of this kind of remote control from a notebook is given after this list. With the imminent arrival on the market of new technologies like the HoloLens, I think that the way we do experimental physics is about to change a lot.
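
To make the "stable access interface" point from the Jupyter item more concrete, here is a minimal sketch of what such a layer could look like. Every name in it (the database path, the SQLite backend, the table layout, get_shot itself) is hypothetical; the only point is that the notebooks call one function and never see how the data is actually stored.

```python
# Minimal sketch of a stable data-access layer exposed to the notebooks.
# Names, paths and table layout are hypothetical.
import sqlite3
import pandas as pd

_DB_PATH = "/srv/labdata/experiments.db"   # hypothetical server-side location

def get_shot(shot_number: int) -> pd.DataFrame:
    """Return one experimental shot as a time-indexed DataFrame.

    The backend (here a single SQLite file) can be reorganised at will;
    as long as this function keeps its signature, the notebooks never change.
    """
    with sqlite3.connect(_DB_PATH) as conn:
        return pd.read_sql_query(
            "SELECT time_s, probe_voltage, rf_power FROM shots WHERE shot = ?",
            conn,
            params=(shot_number,),
            index_col="time_s",
        )

# In a notebook cell a user would simply write:
#   df = get_shot(4217)
#   df["probe_voltage"].plot()
```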

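And here is a rough sketch of what "controlling the board directly from Jupyter" can look like, assuming the Redpitaya runs the stock image with its SCPI server listening on TCP port 5000. The hostname is made up, and the exact command strings depend on the firmware version, so treat this as an illustration of the idea rather than a tested recipe.

```python
# Sketch of a notebook cell talking to the board's SCPI server over plain TCP.
# Hostname and command strings are assumptions tied to the stock firmware image.
import socket

def scpi(sock: socket.socket, cmd: str) -> None:
    """Send one SCPI command, terminated the way the server expects."""
    sock.sendall((cmd + "\r\n").encode())

def query(sock: socket.socket, cmd: str) -> str:
    """Send a query and read back one reply line."""
    scpi(sock, cmd)
    reply = b""
    while not reply.endswith(b"\r\n"):
        reply += sock.recv(4096)
    return reply.decode().strip()

# Hypothetical hostname; use the IP or mDNS name of your own board.
with socket.create_connection(("rp-f01234.local", 5000)) as s:
    scpi(s, "ACQ:RST")        # reset the acquisition block
    scpi(s, "ACQ:DEC 64")     # decimation factor, i.e. effective sampling rate
    scpi(s, "ACQ:START")
    scpi(s, "ACQ:TRIG NOW")   # software trigger, no external signal needed
    raw = query(s, "ACQ:SOUR1:DATA?")   # channel 1 buffer, returned as "{v1,v2,...}"
    samples = [float(v) for v in raw.strip("{}").split(",")]
    print(len(samples), "samples read from the board")
```
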
In addition to that, we had some adventures with the Raspberry Pi and Scratch which are worth sharing as well, but that is another story.
