I recently ran some tests using the Trotec Rayjet laser engraver to see what it could actually do, and how I might exploit that to its fullest.
The main experiments were with white 3mm acrylic. The Rayjet comes with a special commander interface that allows you to assign properties to colours in the drawing you want to cut/engrave.
The basic properties are as follows:
Power: 0 - 100% (how much of the laser's wattage is used in a single pass)
Speed: 0 - 100% (how fast the laser head will move)
Cutting function: Cut or engrave.
All the tests used the engrave function.
I initially conducted the test on the right (see the image below). This test was to establish the cutting depth for each power setting. I kept the power constant at 100% and varied the speed. The notation I'm using is power/speed. The six tests, in order, were:
| Power/Speed | Remaining height (mm) | Cut (mm) |
| --- | --- | --- |
| 100/100 | Inconsistent result * | |
*I tried several times to get this consistent, without success.
I suspect either a problem with the commander software or a colour problem in my art work.
Following the depth tests, I attempted to run tests between 100/10 and 100/1, but the commander software only produced a single power setting, which applied across every colour.
In the end, I found that the most effective technique was to use a single colour for engraving (Black RGB 0,0,0) and to vary the grey level to adjust the height. This means that you decide how deep you want the maximum engraving cut to be by adjusting the power/speed setting, and then use 256 shades of grey to produce your engraving, which actually makes a lot of sense.
The series of gradient tests on the right showed that I could use a gradient to create a particular angle, or curve, depending on the setting. I tested with power/speed settings of 100/2, 100/4, and 100/6. In the end I decided that 100/5 was the best setting that just cut all the way through the acrylic, meaning that I could vary the black (with 256 shades) to get the exact depth I wanted. In fact, I should conduct another test showing steps in 10 percent increments.
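The grey-to-depth mapping described above can be sketched in a few lines. This is a sketch under the assumption that engraved depth scales roughly linearly with grey value; the 3mm material thickness comes from my tests, but the function name and the linearity are my own illustration, not documented Rayjet behaviour.

```python
# Sketch of the grey-level-to-depth idea: pure black (grey 0) cuts
# the full calibrated depth, white (grey 255) cuts nothing.
# A linear response is assumed; names are illustrative.

MATERIAL_THICKNESS_MM = 3.0   # white acrylic from the tests
MAX_CUT_MM = 3.0              # depth that 100/5 just cuts through

def grey_for_depth(depth_mm: float) -> int:
    """Return the 0-255 grey value that should engrave depth_mm."""
    if not 0.0 <= depth_mm <= MAX_CUT_MM:
        raise ValueError("depth outside calibrated range")
    # 0 = black = full depth, 255 = white = no cut
    return round(255 * (1.0 - depth_mm / MAX_CUT_MM))

print(grey_for_depth(1.5))  # mid-depth grey
```

A chamfer is then just a gradient whose grey value falls linearly with distance across the edge.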
My target was to create a 22.5 degree (or thereabouts) angle in the cut. The tests on the right in the image below show that a gradient location of 67.5% in Illustrator gave me approximately a 22.5 degree chamfer. The gradient location is effectively the position of the midpoint between the two gradient colours; at 50% it is halfway between both colours.
All in all I’m really impressed with the Rayjet, I’m going to miss it when I go back to Australia.
A little more experience with the control interface will teach me what I can and cannot do, as I'm sometimes frustrated by not knowing why certain colours don't engrave.
Test results for the engraving:
It’s been a long time since I made any updates on this blog. I’ve got some good news. My current generation of oribots, which are very similar in folded design to Oribotics [network], are in full-scale production at the Ars Electronica Futurelab. Currently I’m one of the inaugural Australia Council artists in residence at the Ars Electronica Center in Linz.
My workshop here consists of a Dimension Elite 3D printer from Stratasys, and a Trotec Rayjet Laser engraver. I’m using the 3D printer to do direct digital manufacturing of 50 Oribotic blossoms, and the RayJet to engrave and cut paper, and polyester fabric.
The newest generation of oribots have a very robust folded membrane made from polyester fabric. It’s really very flexible, the fold patterns are forming well, and during actuation the fold is clearly defined, not corrupting. If someone touches the membrane it won’t be damaged, as was the case with the [network] generation of bots.
The video http://www.youtube.com/watch?v=1gMC1lEi10o shows a really quick test of the ultrasonic sensor in action. I prototype with Arduino, but the actual PCBs are designed by Ray Gardiner, and are currently in production.
The following two diagrams explain the interaction concept that I am developing for the installation at the Ars Electronica Festival. The basic concept is “as above, so below”: micro interactions with an individual flower have ripple effects out to the macro of the installation.
Through observation over the years, the most common interaction gesture, in both children and adults, is to move a hand in front of the oribot’s “mouth”.
Hand push opens the flower
Hand pull closes the flower
(or vice versa)
Interacting with one oribot affects every other bot on the network – they are interconnected.
Interconnectivity, micro and macro interactions.
Each oribot is interconnected to the entire network of oribots. Interacting with one flower causes the entire bloom of oribots to actuate. In one flower, each of the 1050 folds is actuated by the push-pull action of the servo. In a whole network of oribots, one interaction will actuate tens of thousands of folds. This will create a complex moving image. Combined with the RGB lighting in each bot, the installation will be visually arresting. In a simple mathematical sense, a single (micro) interaction causes tens of thousands of (macro) effects.
Thanks to Sean from Hobby Scene in Australia who supplied the HXT900 servo for my current oribotics work. I highly recommend Hobby Scene as the service was quick, friendly and they had the best price, all great reasons to do business!
I’m using the servo because of its compact size, high torque and speed, and lastly the price can’t be beaten. Working with other servos would cost more than 5 times as much.
I did burn one HXT900 by driving it constantly for 3 days, as a stress test. The gear train is still totally intact, and I haven’t deduced the problem yet; perhaps the electronics overheated. I’ll ask Ray for a diagnosis when he arrives. At 3 days, that’s over 50,000 repetitions non-stop. I have seen some Futaba servos in the same size that claim to last over 300,000 repetitions. It might be so, but I wonder if they will last non-stop. I think I can get longer life out of the HXT900 by switching them off periodically to cool down, and not driving them constantly.
This project is an exploration into 3D printing. I’m using Shapeways.com for my printing. They have a good, fast web service for 3D printing. Perfect for n00bs like myself, as the forums are full of advice, and the automated file system runs checks on your uploaded data and sends you back an email telling you what the problem is and where to find help to fix it. I must say, just uploading a few files certainly teaches you something.
I modelled my [network] generation of oribots very small (32 x 32 x 43mm). The idea is to print the form as a complete assembly, including working live hinges. The completed model requires a bit of finishing to protect it from UV light, and to make the surfaces smoother, but overall a great way to build robots that might be very very difficult to make by hand.
Being an artist in this day and age is incredible. Visualisation and articulation are the keys to enabling work with collaborators and companies in the production of a work. You might be a painter, but you still buy your high-quality paints, brushes and canvases from someone; at least there the communication is simpler, more refined by historical usage. In technological works, or even non-technological works, the artist, armed with their “idea”, has to find the path between their vision and the available solutions. The incredible thing is the increased availability of information and complex services.
The image below shows the first prints I have produced. Unfortunately, I made an error in the model, and the hinges don’t work; they fused during printing. But overall, I’m impressed with its quality and strength. I’ve requested that the Shapeways production guys assess the live hinges in my model before printing this time. So hopefully in about a week I’ll have a new working model!
That’s an Australian 50 cent coin, probably only useful as a reference if you are an Aussie; a ruler is probably better internationally. The small one on the right is 43mm high (about 1.7 inches). The image is darkened to enhance the white details.
Functioning prototypes from the 3D printer.
This video (apologies for the low production values) briefly explains and shows the two sets of prototypes that were produced. The second prototype had correct clearances of between 0.3 and 0.5mm for the functioning printed hinges, although the hinge rods, at 1mm in diameter, are too thin to be strong enough for production. I have decided it is a broken micro-oribot embryo, an expression of the fragility of this work.
I’ve been toying with this idea for some time during my residency. The idea is to make a mould that can be used to press, or emboss, a crease pattern into a sheet of paper. The crease pattern I am using is possible to fold by hand, but requires many marks that affect the perfection of the folded form. I should note that any imperfection will soon result in a disfigured oribot. I consider this analogous to protein folding. Proteins fold at an astonishingly fast pace, and one tiny error in the folding can result in serious disease in the life form. The same occurs with my oribotics, in that a small crease out of place will cause the damaged area to deteriorate more rapidly than a perfect area, like a disease, eventually requiring replacement (surgery) on the robot.
But I digress into conceptual artifacts… back to the point at hand.
The form I am visualising is in two parts. The two parts are mates to each other, where one side is indented, the opposite side is outdented. See image below. The final form will have the mountains and valley folds appropriately inverted. The modelling was difficult enough to get this form to work properly, but I have an idea about how to approach the mountain/valley modelling.
This picture is of a 3D model of a very tiny mould, the crease pattern (the embossing area) itself only measures 59 x 39 mm, meaning that some folds are only 2-3mm long (tiny stuff!).
I’ve contacted embossing companies, even micro-embossing companies, and I’m sure it’s possible, but complications arise around the many tons of pressure and the machinery required to do this. I think the ideal form is made from metal, and uses heat to imprint the creases onto a suitable material; in my case, a paper-thin synthetic material, or perhaps simply a very thin, strong piece of paper.
I’m currently an artist in residence in Germany, in a small village, living in a Künstlerdorf (literally, “artists’ village”). One of my projects here is to create a new oribot. I was inspired by a member of the public at ArtBots who reminded me of an idea I once had: to make an oribotic flower that has multiple levels within the blossom. It seems pretty easy while thinking about it, glass of beer in hand. When you actually set about to make a 3D, fractally recursive piece of moving art, then you have to take a few steps back and think about it a while.
Well, I’ve only been thinking about this for the past few days. It’s built on the oribotics [network] design. The crease pattern is called WB75 (super name, I know), from the 75% scale reduction in its modular waterbomb crease pattern. I’m still very much in love with the aesthetic qualities of the WB75, and I like the idea of extending the genus of this particular species. See these videos (made with Blender) of my visualisations of the mechanics. The crease pattern isn’t there, as I have not discovered a way to realistically fold the membrane in 3D software, but I have recently upped my skills in building these mechanical models, thanks to my attendance at the Blender conference in Amsterdam a few weeks back.
The colours are to help distinguish different parts, not an aesthetic choice.
This year’s Tanteidan was very good fun, and very informative. I met Miura-sensei, an expert in mechanised origami, and a wonderful, inspirational man whose work is renowned around the world. He shared some of the process in the development of his space projects.
Below you can see a small range of works that were exhibited during the convention.
This video shows the mechanics for the oribotic leaves. It’s a simple design, but the neat thing here is how using Blender saves me building a physical model just to test whether it will work. I can be confident that this design will function as I need it to.
programmer’s brief - by Matthew Gardiner
To transform xml feeds into emotional states for the oribots.
Do this by gathering and storing xml data feeds, and subsequently digesting the data according to rules defined by keywords. The conceptual idea is that the oribots are feeding on information from the web. The digest program is actually ‘farming’ the information, and processing (digesting) the data into a suitable form for consumption by the oribot.
happy, sad, fear, surprise, angry, disgust, aversion, ambivalence +(more)
what is digestion?
Digestion is a process of analysing a body of text, collecting keywords, and checking the emotional modifier (emo-mod) assigned to each keyword. The highest emo-mod value takes precedence and becomes the emotion for the article. To account for a large peak in a single emo-mod value, each article produces both a peak emotional response and an average response.
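The digestion rule above could be sketched roughly like this (the brief calls for PHP; this Python version is only an illustration). The keyword table reuses the emo-mod examples from the brief; the function name and return shape are my own assumptions.

```python
# Sketch of the digestion step: scan an article for keywords,
# accumulate emo-mod values, and report the peak emotion plus
# per-emotion averages. Table and article text are illustrative.

EMO_MODS = {
    "terrorist": ("fear", 10),
    "baby": ("happy", 20),
    "technology": ("envy", 3),
}

def digest(text: str):
    """Return (peak_emotion, averages) for an article body."""
    totals: dict = {}
    hits: dict = {}
    for word in text.lower().split():
        word = word.strip(".,!?\"'")
        if word in EMO_MODS:
            emotion, value = EMO_MODS[word]
            totals[emotion] = totals.get(emotion, 0) + value
            hits[emotion] = hits.get(emotion, 0) + 1
    if not totals:
        return None, {}
    # the highest accumulated emo-mod value takes precedence
    peak = max(totals, key=totals.get)
    averages = {e: totals[e] / hits[e] for e in totals}
    return peak, averages

peak, avg = digest("A baby plays with technology. The baby laughs.")
print(peak)  # happy
```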
how often does digestion occur?
Digestion occurs at the soonest possible time after the collection of a new article for digestion from a feed. All feeds are periodically checked for new elements. The feed rate can be calculated over a period of time.
how are the emo-mods defined?
All keywords are assigned an emotion and a value modifier. Here is a comma separated list of examples: terrorist = fear + 10, baby = happy + 20, technology = envy + 3.
can emo-mods be changed?
Yes, the user interface on oribotics.net allows users to modify the emotion and modifier value for any keyword.
DATABASE TABLES AND SCHEMAS
feeds - keeps track of feeds: feed URLs, access times
articles - table to download articles into
words - dictionary of words assigned to emotions
emotions - emotions and values
bots - links bots to a SINGLE feed, and its emotion
FEEDS SCHEMA: feedID (int), title (text), feedtype (text) - flavour of feed: atom, rss, rss2, etc., url (text), lastaccess (date), accessfrequency (int) - minutes between checks
ARTICLES SCHEMA: articleID (int) - system id for article, feedID (int), title (text), url (text), shortbody (text), fullbody (text), keywords (text), peak-emo (emoID), average-emo (emoID), datetime (date)
WORDS SCHEMA: wordID (int), word (text), emoID (int), modifier (int)
EMOTION SCHEMA: emoID (int), emotion (text), modifiervalue (int)
BOTS SCHEMA: botID (int), feedID (int), emoID (int)
LOG SCHEMA: feedID (int), emoID (int), emoValue (int), articleID (int), timestamp (datetime, YYYY-MM-DD HH:MM:SS)
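As a sketch, the tables above map onto SQLite DDL roughly as follows, via Python's sqlite3 (the brief targets PHP/MySQL, so this is only an illustration). Column types are normalised to SQLite's, hyphenated field names are adapted, and emoID is treated as an integer key throughout; those choices are my assumptions.

```python
# Illustrative DDL for the feeds/articles/words/emotions/bots/log
# schema, created in an in-memory SQLite database.
import sqlite3

DDL = """
CREATE TABLE feeds    (feedID INTEGER PRIMARY KEY, title TEXT,
                       feedtype TEXT, url TEXT, lastaccess TEXT,
                       accessfrequency INTEGER);
CREATE TABLE articles (articleID INTEGER PRIMARY KEY, feedID INTEGER,
                       title TEXT, url TEXT, shortbody TEXT,
                       fullbody TEXT, keywords TEXT,
                       peakemo INTEGER, averageemo INTEGER,
                       datetime TEXT);
CREATE TABLE words    (wordID INTEGER PRIMARY KEY, word TEXT,
                       emoID INTEGER, modifier INTEGER);
CREATE TABLE emotions (emoID INTEGER PRIMARY KEY, emotion TEXT,
                       modifiervalue INTEGER);
CREATE TABLE bots     (botID INTEGER PRIMARY KEY, feedID INTEGER,
                       emoID INTEGER);
CREATE TABLE log      (feedID INTEGER, emoID INTEGER,
                       emoValue INTEGER, articleID INTEGER,
                       timestamp TEXT);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
```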
Automated functions (php code)
1. harvester (called by cron job)
Read all feeds periodically, check for, and collect new articles, then pass new articles to digester. The shortbody is collected directly from the feed, and fullbody is collected from the article url in the feed.
2. digester (called by harvester or cron job)
Deals with new articles.
Read article body. Extract keywords, run emotional scan on keywords, and determine peak emotional state, and average emotional state. Add record of the state in the log.
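The harvester's "read all feeds periodically" rule could be sketched like this; fetching and parsing are stubbed out, the field names follow the FEEDS schema (frequency in minutes), and the URLs are placeholders.

```python
# Sketch of the harvester's polling decision: a feed is due for a
# check when its access frequency (in minutes) has elapsed since
# the last access. Names and sample data are illustrative.
from datetime import datetime, timedelta

def due_feeds(feeds, now=None):
    """Return the feeds whose check interval has elapsed."""
    now = now or datetime.utcnow()
    return [f for f in feeds
            if now - f["lastaccess"] >= timedelta(minutes=f["frequency"])]

feeds = [
    {"url": "http://example.org/a.rss", "frequency": 30,
     "lastaccess": datetime(2009, 1, 1, 12, 0)},
    {"url": "http://example.org/b.rss", "frequency": 30,
     "lastaccess": datetime(2009, 1, 1, 12, 25)},
]
print([f["url"] for f in due_feeds(feeds, now=datetime(2009, 1, 1, 12, 40))])
# ['http://example.org/a.rss']
```

Each due feed would then be fetched, its new articles stored, and the digester called on each one.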
User interactive functions (php & flash code)
users can do the following:
0. view current emo-state for each robot
1. view the most recent digest that created the emo-state
2. see the list of keywords and emo-mods
3. click on a keyword to alter the emo-mod for words
4. add new words - make words keywords
5. add new feeds, including their personal blogs
6. alter the emotional associations for words
edit pages for the following:
sources -> feeds
words -> emotional associations -> modifier values
when queried by a bot request, outputs an XML file of current states
history of emotional output states
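A minimal sketch of the XML a bot request might receive. The element and attribute names are assumptions, since the actual format is not specified in the brief.

```python
# Illustrative XML response for a bot polling its current
# emotional state; element names are invented for this sketch.
import xml.etree.ElementTree as ET

def states_xml(states):
    """Serialise (bot_id, emotion, value) tuples as XML."""
    root = ET.Element("oribots")
    for bot_id, emotion, value in states:
        bot = ET.SubElement(root, "bot", id=str(bot_id))
        ET.SubElement(bot, "emotion").text = emotion
        ET.SubElement(bot, "value").text = str(value)
    return ET.tostring(root, encoding="unicode")

print(states_xml([(1, "happy", 20), (2, "fear", 10)]))
```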
first taste peak - followed by deviation average tone of article
power - synchronisation - non chaotic - rms - as a measure of information
biorhythm - emotional physical intellectual