
A small dogbone specimen containing a single fiber is prepared so that its surface exhibits enough grey-level variation to perform Digital Image Correlation (DIC) without an additional speckle pattern. The purpose of the experiment is to evaluate the deformation field and the damage mechanisms observed around the fiber (\(100\,\mu m\) diameter) while the specimen undergoes quasi-static tensile loading.
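To make the idea behind DIC concrete, here is a minimal 1-D illustration of the subset matching it relies on (an illustration only, not the software used in the experiment): slide a reference subset over the deformed signal and keep the shift with the highest zero-normalized cross-correlation (ZNCC). The function names are hypothetical.

```python
# Minimal 1-D sketch of DIC subset matching: find the integer shift of a
# grey-level "subset" inside a deformed signal by maximizing ZNCC.
from math import sqrt

def zncc(a, b):
    """Zero-normalized cross-correlation of two equal-length sequences."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def best_shift(reference, deformed, max_shift):
    """Integer displacement of `reference` inside `deformed` (1-D)."""
    n = len(reference)
    scores = {s: zncc(reference, deformed[s:s + n])
              for s in range(max_shift + 1) if s + n <= len(deformed)}
    return max(scores, key=scores.get)

# A grey-level pattern shifted by 3 pixels is recovered exactly:
pattern = [5, 9, 2, 7, 1, 8, 3, 6]
deformed = [0, 0, 0] + pattern + [0, 0]
print(best_shift(pattern, deformed, 5))  # → 3
```

Real DIC does this in 2-D with subpixel interpolation, which is why the specimen's surface needs enough grey-level contrast: a featureless subset correlates equally well everywhere.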

The specimen is loaded in a microtraction machine adapted to perform the test under the microscope’s lens.

The test is stopped 8 times to take pictures with an Olympus OLS4100 laser confocal microscope. It provides height information for each pixel and sweeps through a range of z-heights, producing a crisp, sharp image in every corner of the picture.

[Figure: specimen surface at the first load step (left) and final load step (right)]

On the first image, the fiber is distinguishable, although barely. We used the laser instead of white light to maximize pixel intensity. The white areas reflect the laser and make the image correlation analysis impossible around those defects (glue? contamination? the specimen was cleaned, though. EDIT: they are actually bubbles — voids in the epoxy — partially revealed by the polishing).

On the final image, the fiber has clearly detached completely from the matrix. The material also seems to have shifted along the \(45^\circ\) line passing through the fiber. The white areas' shapes and colors also changed erratically, which is why they cannot be included in the region of interest during the analysis.

These experiments were made possible with the help of Damien Texier and were done at École de technologie supérieure, Montréal.

If you liked this post, you can share it with your followers or follow me on Twitter!

At USNCCM13, we discussed the fact that Peridigm currently only accepts Exodus II mesh files. The only software that generates Exodus II mesh files is the powerful Cubit, whose license can be expensive, especially for someone who only needs its Exodus II conversion feature to import geometries into Peridigm. Patrick suggested writing our own conversion script that translates a widely used free meshing format into Exodus II. We eventually settled on Gmsh as the input format: the script converts Gmsh meshes to Exodus II.

The code is available here. Everything is written in Python, so the only requirement is VTK >= 5.8 with its Python wrapper (sudo apt-get install python-vtk). We currently support triangles and quadrangles as 2D elements and tetrahedra as 3D elements.
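The converter's first step — reading the Gmsh ASCII format — can be sketched in a few lines of pure Python. This is a simplified illustration of the MSH 2 file layout, not the actual script: it keeps only node coordinates and element connectivity, and ignores physical groups and tags.

```python
# Simplified reader for the Gmsh MSH 2 ASCII format (illustration only).
# Node lines are "id x y z"; element lines are "id type ntags tags... nodes...".
def read_msh2(text):
    lines = iter(text.splitlines())
    nodes, elements = {}, []
    for line in lines:
        if line == "$Nodes":
            count = int(next(lines))
            for _ in range(count):
                nid, x, y, z = next(lines).split()
                nodes[int(nid)] = (float(x), float(y), float(z))
        elif line == "$Elements":
            count = int(next(lines))
            for _ in range(count):
                fields = next(lines).split()
                etype, ntags = int(fields[1]), int(fields[2])
                conn = [int(n) for n in fields[3 + ntags:]]
                elements.append((etype, conn))  # type 2 = triangle, 3 = quad, 4 = tet
    return nodes, elements

sample = """$Nodes
3
1 0 0 0
2 1 0 0
3 0 1 0
$EndNodes
$Elements
1
1 2 2 0 1 1 2 3
$EndElements"""
nodes, elements = read_msh2(sample)
print(elements)  # → [(2, [1, 2, 3])]
```

The second step then builds a vtkUnstructuredGrid from these arrays and hands it to VTK's Exodus II writer.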



[Figure: mesh generated with Gmsh (left); Exodus geometry visualized with ParaView (right)]



Get started

It was great working with Patrick on this project, I learned a lot about meshes and VTK. #notscaredofmeshesanymore


A 3D-printed mold was designed to hold 6 steel fibers (steel wire of \(900\,\mu m\) diameter) arranged in two parallel lines of 3 fibers. The drawings show the side and front views of the mold; the red line represents a single fiber. It is attached to the bottom of the mold using hooks that are 3D-printed directly into the mold.

Epoxy is then poured into the mold and cures around the fibers. The mold is then cut horizontally (following the scissor lines on the drawing) to produce several dogbone specimens, each containing transversally embedded steel fibers. Click on the picture and move your mouse around to change the focus point and perspective.

One of the specimens was polished, then covered with light-polarizing film on the front and back to reveal the photoelastic behavior of the epoxy. If you ever need polarizing film, you can buy some, but large sheets are quite hard to find. You can also get high-quality filters by taking apart an old LCD computer monitor (that is what I did here); it is an easy operation that still requires a little care (a tutorial to do so). The other surfaces were covered with black ink (Sharpie) to keep non-polarized light from diminishing the contrast of the polarized light. It is then possible to qualitatively see the residual stresses in the specimen caused by cooling after curing and by the mismatch in coefficient of thermal expansion between the fibers and the matrix.

The density of fringes (isolines of the same colour) qualitatively shows the stress concentration: the more colours there are in an area, the higher the stress concentration. It is possible to distinguish the residual stresses due to the interaction with the mold from the residual stress around each fiber.


A transparent photoelastic polymer containing a single hole of \(500\,\mu m\) is loaded under tensile stress. The same experiment is then repeated using Digital Image Correlation (DIC) as the analysis method. The contour plots shown in the DIC experiment are percentiles of \( \epsilon_1 - \epsilon_2 \), where \( \epsilon_{1} \) is the deformation in the first principal direction and \(\epsilon_2\) the deformation in the second principal direction. The number of photoelastic fringes passing through a material point on the specimen’s surface is also proportional to \( \epsilon_1 - \epsilon_2 \), allowing a qualitative comparison of both methods by looking at:

  • The fringe concentration
  • The number of fringes passing through a single material point and their direction
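The proportionality invoked above comes from the stress-optic law. A sketch, assuming a linearly elastic, isotropic specimen in plane stress (the thickness \(h\) and material fringe value \(f_\sigma\) are left symbolic, as they were not reported here):

```latex
% Stress-optic law: fringe order N seen at a point of a specimen of thickness h
N = \frac{h}{f_\sigma}\,(\sigma_1 - \sigma_2)
% Hooke's law in plane stress relates the principal stress difference
% to the principal strain difference:
\sigma_1 - \sigma_2 = \frac{E}{1+\nu}\,(\epsilon_1 - \epsilon_2)
\quad\Rightarrow\quad
N = \frac{h\,E}{f_\sigma\,(1+\nu)}\,(\epsilon_1 - \epsilon_2)
```

So counting fringes through a point measures the same quantity, \( \epsilon_1 - \epsilon_2 \), that the DIC contours display, which is what makes the side-by-side comparison meaningful.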

This video compares both methods.

Both specimens’ dimensions were chosen according to ASTM D638 Type V. Test speed: \(0.1\,mm/min\).

The photoelastic specimen first reveals large fringes crossing the whole specimen from left to right, probably because the specimen is experiencing torsion, since it was not perfectly straight between the grips. Once the whole specimen emits an almost uniform color (around second 40), 6 petal-shaped (circular) fringes circle the hole.

Experiments done in collaboration with Rolland Delorme at École Polytechnique de Montréal in November 2015.


When capturing images from a lab camera, you might find yourself with a bunch of .tiff images.

The first thing you will need is the ImageMagick package for Ubuntu. You can get it using:

sudo apt-get install imagemagick

This package contains a very convenient convert command. It can be used to convert a .tiff image into a .png while keeping transparency:

convert inputfile.tiff -transparent white outputfile.png

You can then batch-process every matching file in the folder with a simple grep and some shell scripting:

    for file in `ls | grep tif`
    do
        convert "$file" -transparent white "${file}.png"
        echo "writing ${file}.png"
    done

To use it, simply put your .tiff files in the same folder and execute the script. If your files are recorded as .tif and not .tiff, adjust the grep pattern in the first line of the script accordingly.

Final tip: if any other file in the folder (including the script itself) contains the letters t-i-f in that order in its name, it will also be run through convert, even if it is a text file.
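One way around that pitfall is to match on the file extension rather than on the whole name. A sketch in Python (assumes ImageMagick's convert is on the PATH; function names are my own):

```python
# Convert every .tif/.tiff file in a folder, matching on the extension only,
# so a script or text file merely containing "tif" in its name is never touched.
import subprocess
from pathlib import Path

def is_tiff(name):
    """True only for files whose extension is .tif or .tiff (case-insensitive)."""
    return name.lower().endswith((".tif", ".tiff"))

def convert_all(folder="."):
    for tiff in sorted(p for p in Path(folder).iterdir() if is_tiff(p.name)):
        png = tiff.with_suffix(".png")
        # Same ImageMagick call as above, driven from Python
        subprocess.run(["convert", str(tiff), "-transparent", "white", str(png)],
                       check=True)
        print(f"writing {png.name}")
```

Calling convert_all(".") from the folder containing the images reproduces the shell loop, without ever matching on a substring of the file name.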



Getting nice-looking plots into a \(\LaTeX\) document was harder than expected. I use R (through RStudio) combined with ggplot2 (Grammar of Graphics plot) to plot my data, and wanted a convenient way to insert those plots into LaTeX documents. I do something similar with Inkscape. The approach below automatically sets the right font, properly renders LaTeX symbols, and respects the document's font sizes.

The best solution I found uses the tikzDevice package for R and the tikz package (available in pgf) for LaTeX. TikZ plots consist of vectors coded directly into the LaTeX document, so there is no loss in image quality. First, install tikzDevice in R with install.packages("tikzDevice"). For the following example to work, you will also need ggplot2. Once you have both R packages, you can use this R script as an example:

	#For some reason, RStudio needs to know the time zone...
	Sys.setenv(TZ = "UTC")
	library(tikzDevice)
	library(ggplot2)

	#Dummy data for the plot
	y <- exp(seq(1, 10, .1))
	x <- 1:length(y)
	data <- data.frame(x = x, y = y)

	#Create a .tex file that will contain your plot as vectors
	#Set the plot size here: doing it in LaTeX breaks font consistency with the rest of the document
	tikz(file = "plot_test.tex", width = 5, height = 5)
	#Simple plot of the dummy data using LaTeX elements
	plot <- ggplot(data, aes(x = x, y = y)) +
		geom_line() +
		#A space does not appear after \LaTeX, hence the \hspace
		ggtitle(paste("Fancy \\LaTeX ", "\\hspace{0.01cm} title")) +
		labs(x = "$x$ = Time", y = "$\\Phi$ = Innovation output")
	#This line is only necessary if you want to preview the plot right after compiling
	print(plot)
	#Necessary to close the device or the tikzDevice .tex file will not be written
	dev.off()

The output provided by this script in R looks like this:

As you can see, the LaTeX code is clearly visible. The font is R’s default font for now. If you check the folder where you sourced your file, you will find a plot_test.tex file containing the plot information as vectors: every line, word, and symbol is included as LaTeX instructions. Now create a .tex document (any name will do) in the same folder as plot_test.tex, and use this simple LaTeX code to include the plot:

	%The package tikz is available in pgf
	\usepackage{tikz}

	\begin{figure}
		%Do not try to scale the figure in .tex or you lose font size consistency
		%The code to input the plot is extremely simple
		\input{plot_test.tex}
		%Captions and labels can be used since this is a figure environment
		\caption{Sample output from tikzDevice}
	\end{figure}

The result should look like this. The LaTeX symbols are now properly rendered, the font matches the rest of the document, and the plot can be inserted anywhere in your LaTeX document!

One problem you might still have: the font size and position of the plot title and axis legends. You can modify the sizes using the rel() function to scale the font size of each element, and use the vjust and hjust parameters for vertical and horizontal positioning of the axis legends or plot title. R does not prepare your plot for the LaTeX font and sizes, so you will probably need to move the title and axis titles or they will eventually touch the plot itself (visible in the previous LaTeX plot). To correct this, use something like:

        plot + theme(plot.title = element_text(size = rel(1), vjust = 0),
                     axis.title = element_text(size = rel(0.8)),
                     axis.title.y = element_text(vjust = 2),
                     axis.title.x = element_text(vjust = -0.5))


After the Windows 10 update, MiKTeX refused to download new packages and kept signaling a Windows API Error 2. Worse, the MiKTeX uninstaller was nowhere to be found, which made it impossible to properly uninstall it using the Windows program manager. On the machine’s admin account, MiKTeX was not even listed in the program manager anymore.

A simple and efficient fix that worked for me: rename your current MiKTeX folder to Miktex_old, install MiKTeX where the old installation was, then simply delete the Miktex_old folder. That’s it!

This trick might also fix some of the many other bugs MiKTeX encounters on Windows (especially 64-bit).
