Another quick update to show some progress made this evening. I’ve plugged my inverse kinematics code into a quick-and-dirty motor-controlling serial comms link to the Arduino and can now move the head of the delta robot to arbitrary XYZ coordinates. Here’s a video showing it plotting a slightly misshapen square (I need some micro-switches to finish the calibration routine).
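For the curious, the inverse kinematics boils down to solving one shoulder angle per arm, with the target point rotated by ±120° into each arm’s plane. Here’s a minimal sketch of the usual delta-robot formulation — the link dimensions below are made-up placeholders, not my robot’s actual geometry:

```python
from math import sqrt, atan2, pi, radians, cos, sin

# Hypothetical link dimensions in mm -- substitute the real geometry.
F = 200.0   # side length of the fixed base triangle
E = 100.0   # side length of the end-effector triangle
RF = 100.0  # upper arm length
RE = 300.0  # forearm length

TAN30 = 1 / sqrt(3)

def _angle_yz(x0, y0, z0):
    """Solve one arm's shoulder angle in its own YZ plane (degrees)."""
    y1 = -0.5 * TAN30 * F          # shoulder joint y position
    y0 -= 0.5 * TAN30 * E          # shift from effector centre to its edge
    a = (x0*x0 + y0*y0 + z0*z0 + RF*RF - RE*RE - y1*y1) / (2 * z0)
    b = (y1 - y0) / z0
    d = -(a + b*y1)**2 + RF * (b*b*RF + RF)   # discriminant
    if d < 0:
        raise ValueError('point unreachable')
    yj = (y1 - a*b - sqrt(d)) / (b*b + 1)     # elbow joint position
    zj = a + b*yj
    return atan2(-zj, y1 - yj) * 180 / pi

def inverse_kinematics(x, y, z):
    """Return the three shoulder angles for head position (x, y, z), z < 0."""
    c, s = cos(radians(120)), sin(radians(120))
    t1 = _angle_yz(x, y, z)
    t2 = _angle_yz(x*c + y*s, y*c - x*s, z)   # frame rotated +120 degrees
    t3 = _angle_yz(x*c - y*s, y*c + x*s, z)   # frame rotated -120 degrees
    return t1, t2, t3
```

A point directly below the centre should give three identical angles, which is a handy sanity check on the geometry constants.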
I should be able to get some more speed out of it by improving the Arduino comms link (it currently sends individual steps to the motors, one at a time).
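The obvious fix is to send whole moves instead of single steps. A sketch of what a packed message might look like — the framing byte, field layout, and checksum here are assumptions for illustration, not the protocol I’m actually using:

```python
import struct

START_BYTE = 0xAA  # arbitrary framing byte (assumption)

def pack_move(steps_a, steps_b, steps_c):
    """Pack one move as three signed 16-bit step counts plus a checksum byte."""
    payload = struct.pack('<hhh', steps_a, steps_b, steps_c)
    checksum = sum(payload) & 0xFF
    return bytes([START_BYTE]) + payload + bytes([checksum])

# The Arduino end would read the 8-byte frame, verify the checksum, then
# step all three motors for the requested counts, rather than getting one
# serial message per step.
```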
I’ll get around to doing a more descriptive post once I’ve got G-code interpretation working (all the Python code is done, I just have to plug the bits together).
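The core of G-code interpretation is just splitting each line into letter/number words. A minimal sketch of that step in Python (this glosses over modal state, dialect quirks, and everything else a real interpreter needs):

```python
import re

def parse_gcode_line(line):
    """Split one G-code line into a {letter: value} dict of words."""
    line = line.split(';', 1)[0].strip()  # drop trailing comments
    words = re.findall(r'([A-Za-z])([-+]?[0-9]*\.?[0-9]+)', line)
    return {letter.upper(): float(value) for letter, value in words}
```

So `G1 X10.5 Y-3 F1500` comes out as a dict the motion code can act on directly.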
Here’s the little pre-amp circuit used inside the mic:
The op-amp used is a Microchip MCP6241. Not shown are a 5 V power regulator for the circuit (not quite built yet) and the wiring for the jack plug and power switch, both of which are fairly self-explanatory, with the exception that the jack plug doubles as a switch, connecting/disconnecting ground to the circuit and thus acting as a secondary on switch.
Here are a couple of photos of the mic with the electronics almost finished:
Will post a final set of details once the mic is totally finished.
Started working on adding a texture to the generated models. To do this I’ve added an array of LEDs to the inside of the scanner that can be turned on via the serial port on the Arduino to illuminate the model. A full-colour image is captured for every laser-spline image; then, for every point along the laser spline, the corresponding RGB value is looked up in the full-colour image. The RGB values for all the splines are then mapped onto a 2D image, creating the texture.
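The texture-building step looks roughly like this in code — a sketch only, and the array shapes plus the simple linear row-to-texture mapping are assumptions about the pipeline rather than the exact code I’m running:

```python
import numpy as np

def build_texture(color_frames, spline_points, tex_height):
    """Map per-spline RGB samples into one 2D texture image.

    color_frames:  list of HxWx3 uint8 RGB images, one per laser spline
    spline_points: list of (row, col) pixel-coordinate lists, one per spline
    """
    num_splines = len(color_frames)
    texture = np.zeros((tex_height, num_splines, 3), dtype=np.uint8)
    for i, (frame, points) in enumerate(zip(color_frames, spline_points)):
        for row, col in points:
            # scale the image row onto a texture row (simple linear mapping)
            v = int(row * (tex_height - 1) / (frame.shape[0] - 1))
            texture[v, i] = frame[row, col]
    return texture
```

Each spline becomes one column of the texture, so the texture width equals the number of captured splines.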
Once I’ve brushed up on some 3D-modelling theory and the .obj and .mtl file formats, I should be able to create fully textured 3D models.
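The .obj/.mtl side is pleasingly simple: the .obj lists vertices, texture coordinates, and faces that index both, and the .mtl just points a material at the texture image. A sketch of a minimal writer (the material and file names here are placeholders):

```python
def write_obj(path, vertices, uvs, faces, mtl_name='scan.mtl'):
    """Write a minimal textured Wavefront .obj.

    vertices: list of (x, y, z); uvs: list of (u, v)
    faces: list of ((v1, t1), (v2, t2), (v3, t3)) 1-based vertex/uv index pairs
    """
    with open(path, 'w') as f:
        f.write(f'mtllib {mtl_name}\nusemtl scanned\n')
        for x, y, z in vertices:
            f.write(f'v {x} {y} {z}\n')
        for u, v in uvs:
            f.write(f'vt {u} {v}\n')
        for face in faces:
            f.write('f ' + ' '.join(f'{vi}/{ti}' for vi, ti in face) + '\n')

def write_mtl(path, tex_name='texture.png'):
    """Write the companion .mtl pointing the material at the texture image."""
    with open(path, 'w') as f:
        f.write(f'newmtl scanned\nmap_Kd {tex_name}\n')
```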
For now, here’s the texture image I captured for the nodding-monkey model I’ve been using.