FYP Process 9: The Real Build, the Long Process of Making the Head.

After a long time of not updating, I have started to build the actual head for the robot using EVA foam, which was my first time using it. The reason I am using EVA foam is that it is lightweight yet firm, easy to cut and form into shapes, and joining the parts only requires contact glue (other glues will work, but contact glue gives a better bond). Plus, I know that I will be building and conceptualizing at the same time, as I do not have any blueprint; every little detail on the robot will be an impromptu decision made during the building process, so the low price of EVA foam lets me work more freely without being afraid of making a mistake and paying a huge sum of money for it.

To start it off, I converted the paper Husky head into an EVA model, with 10mm-thick EVA as the base layer.

Planning and cutting out the paper pieces to form a template, taking into account how much the EVA foam can bend so that the final shape will still come together (10mm-thick EVA foam bends drastically differently from paper, so I had to keep this in mind when forming the paper template).

The foam was cut with slanted edges to form angular joints with the pieces at the sides. The curves were made by heating the EVA foam and bending it into place, and all connections were made with contact cement to make sure they last over time.

The blade needs to stay consistently sharp for a high-quality cut, so it was sharpened after every few cuts.

Test pieces were made and resin was applied to test the result and to strengthen the EVA foam, to make sure the parts last as long as possible.

To learn and understand the materials fully, the resined test pieces were sprayed with black model paint to see how they would look with different variables (how the cut affects the outcome, how the thickness of the resin affects the appearance, and so on).

Details were then cut and added to the head, all from EVA foam, using a knife, contact cement and a hot air gun.

After the front details were added, I dissected the head to add a magnet system that will ease maintenance in the future if it is ever required.

After which, eyes were added to test the appearance of the head. The eyes were installed in a special way that gives the optical illusion that the head is looking at the user no matter where the user is standing.

And then, more details were added to the front of the head.

The front was coated with more than 8 coats of resin and sanded, which helps the paint stick to the head. (This whole process took about 2 weeks, as the resin needs time to fully harden between layers; coating it while preventing drips, plus the sanding, is a labour-intensive process.)

The sanded head was then airbrushed with primer (to help the paint stick to the head, and also to help me see the surface quality, like checking for small bumps).

After a few coats of primer, with sanding between the coats, the head was masked and sprayed to create a clean and beautiful paint job.

For the back of the head, details were built with laser-cut parts, while LEDs were soldered and installed.

Then, like the front, details were drawn, cut out from EVA foam and stuck onto the back of the head with contact cement. The LEDs on the back were also tested to see which colour combination looks best.

Same as with the front, the back was resin-coated with many layers, then sanded and primed. Meanwhile, for the internal structure, speakers were soldered and installed inside the ears, with properly placed velcro to help with maintenance in the future.

For the front of the head, acrylic pieces were cut and heat-formed with the hot air gun to provide a protective cover for the eyes, and two layers of tint were stuck onto it to darken the overall feel and give the dog head a more compelling eye. (This greatly affects how the eyes look on camera, and I hope people will be taking photos/videos of it during the FYP show.)

After which, the dog head was masked and sprayed with many layers of model paint.

Towards the end, after the base colours were masked and sprayed, the corners were touched up with a small brush. Water decals from toy models were added to the head to give it a more interesting finish and make the finishing complete. In my research, I found that small details are the key factor that differentiates a normal artwork from an insanely impressive one.

Lastly, for the appearance of the head, it was sealed with two thick coats of Samurai lacquer spray paint and then airbrushed with a model-grade matte clear coat to remove the shine from the lacquer (which is too glossy and makes it look like a plastic toy). Small details like carbon-fibre vinyl were stuck to the sides of the head to finish it off and give the dog head a different texture.

The process of making the head from start to finish was long, but I learnt many skills, as this was my first time using EVA foam, so everything was a good lesson; even small things like “how to cut EVA foam properly” were invaluable lessons for me. Throughout this process, I also learnt which materials can or cannot go with each other, and how to get the paint to stick firmly onto the resin without peeling off.

 

 

FYP Process 8: Explorer 27 Growing Up.

A few weeks ago, Explorer 27 was a short, dog-sized robot:

And after weeks of further development…

 

A fast prototype using sticky putty to form a tablet holder.

A lot of time was spent cutting the aluminium profiles and attaching them securely with Nyloc nuts and spring washers.

After which, I also built the new paper model dog head (Husky).

And installed the LED eyes into this new paper Husky head.

I re-coded the UI in Unity and also connected it to the Arduino system.

After the Unity program worked, I exported it as a working build and installed it onto the Windows tablet to run the system program for the robot. The head was also attached to the new robot body.

Head with the UI animation.

As you can see, when the user moves the robot, the eyes follow the direction. This was done by transmitting data from one Arduino to the other using the I2C protocol, which connects multiple Arduinos together, and then on to the Unity program on the Windows tablet.
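Since this chain of data is the core of the head, here is a minimal sketch of the receiving side, assuming the joystick master pushes a single direction byte (0 for idle, 1 to 8 for the eight directions) over I2C; the slave address and the encoding are my own placeholders, not the exact code running on the robot. The receiving Arduino simply mirrors that byte out over USB serial so the Unity program on the tablet can steer the on-screen eye.

#include <Wire.h>

const byte MY_I2C_ADDRESS = 8;   // assumed slave address, not the robot's real one
volatile byte eyeDirection = 0;  // last direction byte received from the master

void receiveDirection(int byteCount) {
  while (Wire.available()) {
    eyeDirection = Wire.read();  // keep only the latest direction
  }
}

void setup() {
  Wire.begin(MY_I2C_ADDRESS);        // join the I2C bus as a slave
  Wire.onReceive(receiveDirection);  // runs whenever the master sends data
  Serial.begin(9600);                // USB serial to the Windows tablet / Unity
}

void loop() {
  Serial.println(eyeDirection);      // Unity reads this value and moves the eye
  delay(50);
}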

 

 

FYP Process 7: Development and Upgrades.

Processor protecting case:
Parts strengthening to make sure nothing comes loose:
Experimental Head for the robot:
Plastic model building from the Japan Research Trip:

This is the link to the Japan research trip.

Lattepanda with Unity and Arduino:

Experimenting with the speed and delay of the LattePanda using Unity to control the Arduino, and the result is not bad!

FYP Process 5: Going Against the Law? Hardware + Software

According to Isaac Asimov,

the three laws of Robotics:
1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2) A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Sounds really awesome, doesn’t it? Of course, there are situations where the robot will be forced to break these laws, but then again… what if a robot is made to go against the first two laws?

1) A robot may lightly injure a human when required to get the task done.
2) A robot may not obey all orders given to it by human beings.
3) A robot must protect its own existence and continue to execute the first and second law.

Sounds really funny to me. A doomsday robot? AHAHHAHA! Sure, why not? OK, maybe not. These laws are still in the conceptualizing phase, and if they make it into the final project, it will be quite funny for those who have read or know the three laws of robotics.

What has been done since the last update?

Re-structuring the robot:

As of the previous update (the one before the presentation), where I constructed the wheels and the base structure for the robot, I noticed the parts getting looser and looser even when I tightened every nut and bolt at every connection; this was caused by vibration whenever I moved the structure around. To counter this problem, I disassembled the structure and changed all the bolts and nuts to self-locking nuts (Nyloc nuts) paired with spring washers, to reduce the rate at which they loosen over time. A loose part only gets looser through vibration, so if this were not done at this early phase, it would be a real hassle at the end, as I want the robot to allow relatively fast parts replacement (so no joints should be joined with a permanent solution like super glue), and the combination of spring washer and Nyloc nut is the most effective solution for its cost (in time and money).

Nyloc nut, to “lock” the nut and prevent it from slipping:

Spring washer, to prevent loosening through vibration


LED matrix animation:

The LED matrix display is made of four 1088-type 8×8 LED matrices, which will be used for the eye animation of the robot. It is attached to an Arduino Nano that will be labelled the “Emotion Processing Unit” in this project, to give it a more developed feel and move it away from the “Arduino = prototype” perception.

Parola library code:
Robot eye library code:
Animation made through a program that converts animations into LED matrix codes:
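For a rough idea of what the EPU sketch looks like, here is a stripped-down version assuming the four 1088 matrices are driven by chained MAX7219s (FC-16 style modules) on hardware SPI with CS on pin 10, using the MD_MAX72xx library that Parola is built on; the pins, module type and eye bitmap below are placeholders rather than the actual animation frames.

#include <MD_MAX72xx.h>
#include <SPI.h>

#define HARDWARE_TYPE MD_MAX72XX::FC16_HW   // assumed module type
#define MAX_DEVICES   4                     // four 8x8 modules chained together
#define CS_PIN        10                    // assumed chip-select pin

MD_MAX72XX matrix = MD_MAX72XX(HARDWARE_TYPE, CS_PIN, MAX_DEVICES);

// One 8x8 frame of a simple "eye"; each byte is one column of a module
const uint8_t eyeFrame[8] = {
  0b00111100, 0b01000010, 0b10011001, 0b10111101,
  0b10111101, 0b10011001, 0b01000010, 0b00111100
};

void drawEye(uint8_t device) {
  for (uint8_t col = 0; col < 8; col++) {
    for (uint8_t row = 0; row < 8; row++) {
      matrix.setPoint(row, device * 8 + col, bitRead(eyeFrame[col], row));
    }
  }
}

void setup() {
  matrix.begin();
  matrix.control(MD_MAX72XX::INTENSITY, 3);  // keep the brightness comfortable
  matrix.clear();
  drawEye(0);   // one eye on the first module
  drawEye(3);   // the other eye on the last module
}

void loop() {
  // the real animation steps through frames exported from the converter tool
}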

Mecanum wheel coding:

Each Mecanum wheel is driven by a 5-wire motor (VCC, GND, PWM, Dir and FG wires), so a motor driver was not required and I could wire them directly to an Arduino. All the code is stored and run on an Arduino Uno, and the main switch that controls the motors is linked to a relay on the motor circuit. As Mecanum wheels move unlike normal wheels, special code was used to move the robot in any direction without changing its orientation.
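To make “any direction without changing the orientation” concrete, here is a small sketch of the mecanum mixing, assuming each 5-wire motor exposes its PWM speed pin and direction pin directly to the Uno; the pin numbers and the joystick scaling are placeholders, not the exact wiring on Explorer 27.

const int pwmPin[4] = {3, 5, 6, 9};   // FL, FR, RL, RR speed (PWM) pins
const int dirPin[4] = {2, 4, 7, 8};   // FL, FR, RL, RR direction pins

void setWheel(int wheel, int speed) {
  digitalWrite(dirPin[wheel], speed >= 0 ? HIGH : LOW);        // sign -> Dir wire
  analogWrite(pwmPin[wheel], constrain(abs(speed), 0, 255));   // magnitude -> PWM wire
}

// vx: strafe, vy: forward/back, rot: spin on the spot, all in -255..255
// (the signs depend on how each wheel's rollers are mounted)
void driveMecanum(int vx, int vy, int rot) {
  setWheel(0, vy + vx + rot);  // front left
  setWheel(1, vy - vx - rot);  // front right
  setWheel(2, vy - vx + rot);  // rear left
  setWheel(3, vy + vx - rot);  // rear right
}

void setup() {
  for (int i = 0; i < 4; i++) {
    pinMode(pwmPin[i], OUTPUT);
    pinMode(dirPin[i], OUTPUT);
  }
}

void loop() {
  driveMecanum(150, 0, 0);   // example: strafe sideways without turning
}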


 

Master-Slave Arduino System

Arduino has a feature called the master-slave relationship, whereby a master Arduino can communicate with many other Arduinos through just two analog pins (A4 and A5, which double as the I2C lines on the Uno). This is really helpful for my robot. As the saying goes, “do not put all your eggs in one basket”: this setup will ease repairs if any part becomes faulty during the FYP show, because I can just swap out the problematic part without re-wiring the WHOLE five bazillion wire connections. Also, an Arduino is a single-threaded microcontroller (it can only do one task at any moment), so by splitting the different processing requirements across different Arduinos, I can effectively run multiple threads without any lag between them. Finally, the master-slave system also solves the hardware limitation of the small number of digital pins each Arduino has (more Arduinos = more pins I can utilise without adding more parts or code, like an I2C address-expander chip).

wiring diagram

Things are getting technical here…
The wiring diagram looks messy, but in plain language: a master Arduino is controlled by a clickable analog stick (mode 1 to move, mode 2 to rotate). The data is processed and sent through analog pins 4 and 5 to the MPU and EPU, where the “Movement Processing Unit” (MPU) controls all the motor movements and the “Emotion Processing Unit” (EPU) controls the eye animation on the LED matrix. All of the Arduinos and motors are connected in parallel to the 12V battery supply to power them.
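For the technically inclined, here is a sketch of what the master’s send loop boils down to, assuming the MPU listens on I2C address 8 and the EPU on address 9 (both placeholders), with the analog stick on A0/A1 and its click button on digital pin 2; the real code does more processing, this is just the shape of it.

#include <Wire.h>

const byte MPU_ADDRESS = 8;   // Movement Processing Unit (assumed address)
const byte EPU_ADDRESS = 9;   // Emotion Processing Unit (assumed address)
const int  BUTTON_PIN  = 2;   // joystick click toggles move/rotate mode

byte driveMode = 1;           // 1 = move, 2 = rotate

void setup() {
  Wire.begin();               // join the bus as the master (A4/A5 on an Uno)
  pinMode(BUTTON_PIN, INPUT_PULLUP);
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {  // a click swaps between the two modes
    driveMode = (driveMode == 1) ? 2 : 1;
    delay(300);                          // crude debounce
  }

  byte x = analogRead(A0) / 4;           // squeeze 0-1023 into a single byte
  byte y = analogRead(A1) / 4;

  Wire.beginTransmission(MPU_ADDRESS);   // stick position + mode to the motors
  Wire.write(driveMode);
  Wire.write(x);
  Wire.write(y);
  Wire.endTransmission();

  Wire.beginTransmission(EPU_ADDRESS);   // the same direction data to the eyes
  Wire.write(x);
  Wire.write(y);
  Wire.endTransmission();

  delay(50);
}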

 

O.U.R.S Chief Engineer at work:

Sample Movement of Explorer 27:

 

FYP Presentation 1: OURS, Explorer 27 the Lost Robot

SLIDES and VIDEO compilation

The slides were made with multiple programs:

Adobe After Effects for the animations (I got my animation template online and edited it)
Adobe Premiere Pro for converting the animations into GIFs
PowerPoint for the slides themselves

Why use GIFs for animation? Because a video in a slide is not loopable and has too many constraints, so if I split one animation into two GIFs (Appearing and Looping) and use the “Appear” animation function in PowerPoint in a well-timed manner, I can create an illusion that it is looping.

Example:

This is the “Appearing Animation”, which plays automatically because it is a GIF, and it will keep “appearing” because the GIF loops, so when stacked with the “Looping Animation”…

This is the “Looping Animation”, which automatically appears (with PPT’s “Appear” animation) 3.2 seconds after the slide starts, so the animation looks like it flows somewhat nicely.

And when put together in PPT, it looks like this. This GIF has the “Looping Animation” played 8 times, after which it “appears” again, but that is just for this example; in the actual PPT it loops indefinitely until I click to the next slide.

That was how the slides were made! And I learnt After Effects for it because most of the animations I used in these slides can be applied to the UI of the robot (remember I have a screen that needs a user interface? Yes, that’s the one).

Also, the lab coat I wore during the presentation was customised with the OURS logo to have a more cohesive feel with the theme.

 

What I learnt from the presentation:

Anthropomorphism in robotics

Main issues in the FYP:
Fake AI -> Believability -> Lost -> Anthropomorphism

Lem’s Solaris (book, and the movie with George Clooney), I, Robot, and the Russian film Stalker

FYP Process 4: Construction, Unity and Lidar

Aluminium profile measuring and cutting

20mm*20mm aluminium profile was used to construct the internal frame because it is easy to work with, lightweight and durable.

(Just for the record, I did not work in the dark but turned off the light for documentation purposes only.)

3D Modeling

This is the mounting adapter created in Tinkercad to match the steel bearing, the motor and the wheel to the 20mm×20mm aluminium profile. The adapter for the suspension was created later, as the suspension had not arrived at the time of modelling the mounting adapter; however, the three parts fit nicely on the first try!

3D printing

Frame construction and 3D-printed part assembly.

Many different screws, bolts and nuts were used to construct the frame and attach the wheel mounts to the aluminium profiles.

Suspension Addition

The suspension is required because Mecanum wheels need friction to move in any direction, so the suspension not only reduces vibration but also forces the wheels to keep traction with the floor at all times, even on uneven surfaces.

Unity:
Animation done in Unity,

Sample quest screen (there will be another screen to show the map or something).

These three were drawn in Adobe Illustrator in separate layers and imported into Unity to be animated.

Lidar:

Lidar uses an infrared laser to sense distance in a 360-degree sweep; it can give thousands of readings per second, and the program calculates them and maps them out as a point cloud.
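The maths behind that map is simpler than it sounds: each sample is just an angle and a distance, so building the point cloud is a polar-to-Cartesian conversion. A tiny illustration (the struct and units are my own, not the SDK’s actual data format):

#include <math.h>

struct LidarPoint {
  float x;   // millimetres, robot-centred
  float y;
};

// one lidar sample (angle in degrees, distance in mm) -> one point on the map
LidarPoint toPoint(float angleDegrees, float distanceMm) {
  float angleRad = angleDegrees * 3.14159265f / 180.0f;
  LidarPoint p;
  p.x = distanceMm * cosf(angleRad);
  p.y = distanceMm * sinf(angleRad);
  return p;
}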

I bought a lidar from China to try it out, and the one I bought was the C0602.

When using the program that was provided, the lidar works perfectly. But since I am going to use it in Unity, there are problems: Unity runs on C# (a programming language) while the SDK (Software Development Kit) runs on C++. Although it is not impossible to write a bridge between the two languages, or to write new C# code to use the lidar directly in Unity, it was beyond my capabilities after trying for a week. While researching how to do it, I came across another lidar that comes with Unity code; however, that code does not work on the lidar I currently have…

SO, the simple solution to this problem: either I ditch the idea of using a lidar, OR I buy the lidar that comes with the Unity code and put my current lidar up for sale online (at a higher price than I bought it for, of course). And of course I chose the second option: sell the one I have and buy the one that provides Unity code. Carousell, my best friend.

 

FYP Soul – Why?

Why a robot? Why a guiding robot? Why a whole system, including a company and a backstory of how the robot came to the FYP exhibition?

First things first: why am I building a robot for FYP?

The answer is much more than just because I like it (and of course I do!)
Culturally, there are two opposing views of robots: a Western one, in which the robot threatens us by stealing our jobs and eventually bringing us to annihilation, and a Japanese one, in which the robot is seen as a hero that enhances the quality of life. Since the invention of the karakuri puppet in the 16th century, the Japanese have enjoyed watching things move automatically, and it is still really fascinating to see something move by itself today, as we unconsciously anthropomorphize the object. I personally think robotics will be the next advancement for the world: as our computational power increases exponentially, the only physical way we can put these newer technologies to good use is through something that uses technology and has a physical, tangible form, just like a robot. Although the term “robot” is loosely used, the general idea is similar: a physical object that moves without a human, through a set of pre-determined protocols.

So, why specifically a guiding robot?

This is because I want to be of some use to our FYP batch. The guiding robot’s main purpose is to serve just one function: to bring visitors to a student’s booth, which will increase that student’s exposure. Even if, over the whole show, my robot only manages to bring one visitor to one student’s booth and that visitor enjoys the booth, I would consider my FYP a success, because I helped someone (visitor or student) experience the FYP show in a slightly better way.

How about a lost guiding robot?

For now, I will be building a lost guiding robot that needs the visitor’s help to locate the student’s booth. It seems counter-intuitive to make a LOST guiding robot, since the worst thing a guiding robot can do is get lost; however, by going the opposite way (metaphorically), the end result still serves the same function: a robot that guides (narratively, is guided by) the visitor to the booth. This way, the user’s experience of and interaction with the robot will be different, as they will feel like they are helping the poor robot to find and complete its task, and the visitor will feel a sense of duty and accomplishment when they find the booth.

How does this work?
All of this stems from the word “altruism”: the belief in or practice of disinterested and selfless concern for the well-being of others. In this case, sacrificing the user’s own time to help a random robot (which, by logical thought, does not need help and does not have feelings; however, humans are complex and probably will not act purely by logic).
As helping others gives us a sense of purpose and satisfaction, I want to instil this idea into my project so that users feel like they are really helping the robot and feel the satisfaction of completing the task (which in turn makes a happier visitor and a memorable experience for them).

Why a whole system, including a company and a backstory of how the robot came to the FYP exhibition?

This is to adapt the power of fictional narrative to change people’s attitude towards social change (a robot in the FYP exhibition) by using narrative persuasion: a method that uses narrative transportation to persuade us to change our minds and behaviour, to see the world differently, and to put things into context even when the story is fantastical.

 

 

Research to be done:
Interaction of human and robot
Social Robot
Programmed behavior
Slot machine reward system

 

 

https://topdocumentaryfilms.com/human-robot/

https://topdocumentaryfilms.com/inhuman-kind/

https://topdocumentaryfilms.com/robot-revolution/

 

FYP Process 3 – Rebranding: O.U.R.S. + Parts and Systems

Changing Project title and Styling

Inspired by the trip to Japan, I decided to change the theme/styling of my FYP from a futuristic company (Hephaestus System, Link 1 & Link 2) to a tech company in a cyberpunk-ish futuristic world. After giving the branding of the tech company some in-depth thought, I came up with “O.U.R.S”, the acronym for “Omni Utility Robotic Systems”. There are many reasons this is the perfect title for my FYP. The original idea of my FYP is to build something that can help out with every FYP student’s exhibition, so “OUR” robot fits the bill perfectly. Furthermore, I am changing the drive system from a simple robot that moves like a remote-controlled car (only forwards and backwards with turns) to an omni-wheel system (Mecanum wheels), with which it can also move diagonally/sideways without changing its orientation. “Omni Utility” can mean both an omni-wheeled robot with practicality and a multi-utility robot with many functions built into it, while “Robotic Systems” follows the same idea as the previous “Hephaestus Systems”: I do not want to create just a single robot, I want to create a system in which the robot is only one part. So Omni Utility Robotic Systems (O.U.R.S) will probably be the name of my (imaginary) company as well as the title of my FYP.

Logo Design:

With a picture I took at Shibuya Crossing, Tokyo, Japan, as the “mood board”.

Improved Unity UI(Slightly)

Added a simple (sample) animation with the standard UI; the buttons and back buttons work fine. (The actual animation is much smoother, but the GIF is slow.)

Unity-Arduino Control

I learnt how to use Unity to send serial output to the Arduino; although it is quite easy in the end, it took me a really long time to figure out how it is done.

The Arduino code was written so that Unity sends multiple values inside < > brackets, used as start and end markers, so that the same message can later be extended to spin the motors and SMS the student through the Arduino. An example of the serial communication: <255,255,255,255,(phoneNumber),(UsersMessage)>
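Here is a minimal sketch of the Arduino side of that protocol: it collects everything between ‘<’ and ‘>’ into a buffer and only parses it once the packet is complete, so the message never gets chopped up mid-stream. The buffer size and the way the first field is pulled out are my own simplification, not the robot’s full parser.

const byte BUFFER_SIZE = 64;
char packet[BUFFER_SIZE];
byte idx = 0;
bool receiving = false;

void handlePacket(char *data) {
  // first four comma-separated values are motor speeds; the remaining fields
  // (phone number, message) would be pulled out the same way with strtok(NULL, ",")
  int firstValue = atoi(strtok(data, ","));
  Serial.print("Got packet, first value: ");
  Serial.println(firstValue);
}

void setup() {
  Serial.begin(9600);   // Unity writes to this serial port
}

void loop() {
  while (Serial.available() > 0) {
    char c = Serial.read();
    if (c == '<') {                        // start marker: begin a new packet
      receiving = true;
      idx = 0;
    } else if (c == '>' && receiving) {    // end marker: packet is complete
      packet[idx] = '\0';
      receiving = false;
      handlePacket(packet);
    } else if (receiving && idx < BUFFER_SIZE - 1) {
      packet[idx++] = c;                   // body of the packet
    }
  }
}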

New Parts, New Systems

At this point I had decided to build ONE robot instead of TWO, so my budget for building the one robot effectively doubled instantly. This is the reason I changed from the idea of using the kids’ vehicle to using Mecanum wheels, because it will be much more impressive.

This is an 11-inch screen with an HDMI driver board and a separate touch panel; however, the seller sent me the wrong USB cable, so I can’t test the touchscreen until they replace it. The screen itself works well! It is a bit small, but it is good for my budget and power constraints (my robot will run on an internal battery, so every component’s power consumption must be taken into consideration before anything else).

The 2cm×2cm aluminium profile arrived, and I measured the size I will probably need for the robot’s base frame. The largest single component in the robot will be the laptop, which will run the calculations inside the robot, so the minimum base size must be at least the size of my laptop (the wheels will protrude from the frame, so the actual footprint will be about 15cm wider). I have cut the profile to form a 47cm by 40cm frame (plus ~15cm for the wheels) for now, and I might make changes along the way.

 

For the SMS system, which will be in the robot so visitors can leave feedback directly for the student, I purchased a SIM card along with a SIM900A module, which enables the Arduino to send SMS through the SIM card.
After some time trying out the SIM900A module, it would not connect to the service provider. I did some research, and apparently this module only uses 2G, and Singapore has phased out its 2G service, so the module cannot be used in Singapore. I therefore had to purchase a better SIM module that supports 3G/4G (they are rather expensive, but I had to do it…)
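Whichever module ends up in the robot, sending an SMS from an Arduino works the same way: the module sits on a serial port and takes standard AT commands. A bare-bones sketch, assuming the module’s serial lines are on pins 7 and 8 and with a placeholder phone number:

#include <SoftwareSerial.h>

SoftwareSerial gsm(7, 8);   // RX, TX to the GSM module (assumed pins)

void setup() {
  gsm.begin(9600);
  delay(3000);                               // give the module time to register

  gsm.println("AT+CMGF=1");                  // put the module into text mode
  delay(500);
  gsm.println("AT+CMGS=\"+6591234567\"");    // placeholder recipient number
  delay(500);
  gsm.print("Feedback from the FYP booth");  // the visitor's message goes here
  gsm.write(26);                             // Ctrl+Z tells the module to send
}

void loop() {
}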

The motor changes direction when the direction-change wire is connected to ground. Powered by a 12V battery, an Arduino Uno and a PCA9685 servo driver.

The Mecanum wheels and the motors will probably be the most expensive components in my whole robot (other than the laptop, which I already have), as I purchased the best-value components, not the cheapest: industrial-grade CNC aluminium Mecanum wheels (10cm diameter and capable of handling a 45KG load) to make sure they last at least a year plus the month of exhibition time. The motors are high torque (8kg·cm, and this is torque, not the load they can carry) and have PWM speed control, speed feedback and braking built in.
So despite the price, I have got the best possible combination of parts for my given budget.

Have you ever wondered what a freelancer could do for you?

I thought of this as a joke (or as potentially useful for my FYP): what if I could have someone draw my initial concept for me, to have a starting point for developing my robot?

Since asking people to do it for free is not something I prefer, I went to Fiverr to look for character design gigs just for fun. After asking multiple sellers what they could do, I found one potential candidate.

Of course I had no expectation of getting something I could use as a final design, but it would be really cool to see a work created especially for my project… I gave the artist a super vague brief on purpose, because I just wanted to see what kind of weird ideas I could get, and if any idea is good enough, I will just adopt it into my robot!

So after half a day or so, the seller delivered the drawing I requested.

It is really cute but definitely not something I would use; it looks like a cardboard robot and is not what I would like to build. Still, it is really funny and refreshing to see my potential “robot” going for a picnic alone!

I shall just start thinking about the design on my own (after all, I have done so much reference research on existing artwork, hence this weird idea of getting someone else to draw me a tailored robot), take advice from the friends around me, and not do weird stuff online.

Other Inspirations:

Is the time of the personal robot coming of age?

FYP Process 2 – Research in Japan

During the holiday, I went to Japan to do research, since Japan is the first country I think of for robotics: they are really advanced, and I love the styling of the robots they build (digitally or physically), having grown up watching robot anime like Gundam.

Japan was awesome in technology, toys, and food!!

The Gundam at Odaiba, Tokyo. I really like the styling of Gundams.

This is the National Museum of Emerging Science and Innovation (Miraikan) in Tokyo.

The iconic spot of Miraikan; I am amazed by the scale of it.

Upon entering the museum, I was greeted by one of the robots. It’s really cool to see an “undressed” robot, as I could take a look at the construction and the systems it was using. Cool stuff!

 

This personal vehicle, made by Honda, can carry a person, much like a hoverboard but in a sitting version. Visitors could try it, but I missed this activity; otherwise I would really have loved to try one.

The main reason I went to Miraikan: to see the famous ASIMO. It’s really cute and I love its astronaut styling. I must say it is really well made; it can dance and kick a ball, and although the routines are pre-programmed, it is still inspiring nonetheless.

There was a touch screen that displays details about the globe over there and explains what is shown on it. Since I am also using a UI in my robot, this is the futuristic style I had in mind: a black background with bluish outlines that has an advanced feel to it.

The history of bipedal robots

 

I was inspired by the space exhibit and thought it would be really cool to build my robot with a space + futuristic theme, as I like that style a lot.

Many of the exhibits in Miraikan were screen-based installations not related to my FYP, other than those in this post; however, they were all amazingly made and really impressive.


Interior of Miraikan.

And I really like ASIMO, so I bought a figurine; it was rather expensive, but it will work as an exhibit for my FYP booth.

 

 

FYP Process 1

After many different trials and setbacks, and after going through many online courses for Visual Studio (which I initially wanted to use to create the interface for the robots), senior Kapi suggested I use Unity instead, as it makes it easier to build a nicer graphical interface (from everything I went through, Visual Studio would just be hard to make a beautiful UI with) and it can do advanced AI like path-finding, which I might use in the actual robot for navigation. Unity also has a better particle system, which I will use to create animations for the UI. So I gave up the idea of using Visual Studio for the back-end programming; but while going through the Unity tutorials, I found that the scripting side of Unity is still done in Visual Studio, so my effort in learning Visual Studio wasn’t wasted, because it fits well into the Unity workflow.

To start off prototyping, my target: to create a page that can change the image and text when any of the buttons placed within a scrollable panel is clicked.

This will be required because the final product should let users browse each student’s information and work, so there needs to be an easy-to-use user interface that allows them to do so.

The course which I went through:

https://www.udemy.com/c-sharp-visual-studio-app-development-tutorial/

The course which I am currently going through:

https://www.udemy.com/unity-master-video-game-development-the-complete-course

https://www.udemy.com/robotics-for-beginners-build-time-control-robot-from-scratch

Additional Tutorial.

https://unity3d.com/learn/tutorials/modules/beginner/ui/ui-scroll-rect

https://unity3d.com/learn/tutorials/topics/user-interface-ui/ui-scrollbar

https://unity3d.com/learn/tutorials/topics/user-interface-ui/ui-mask

 

Current Progress:

This is a GIF; click to play.

 

 

After thinking about the problem of the robot’s sensing (I must try not to have the robot knock people down), I considered lidar for a long time, but the price and the spinning part were the main things that put me off. It costs around $150 for just one sensor, which I am not confident will even work in my project, and because it is the lidar head that spins, it has to be placed on top of the robot, where I am sure it will break somehow due to visitors’ curiosity. After a whole lot of research (I will still use ultrasonic/infrared sensors like we did in class, but these are not reliable enough to be the primary sensor), I came across something that seemed really promising… let me introduce to you: the old KINECT.

Why? Why the Kinect? Because:
1- It does not have moving parts, yet it can detect a wide area with roughly accurate distances.
2- It is easy to conceal in the design of the robot, as it can be placed within the shell.
3- It is not expensive, and I already have one unused Kinect at home; furthermore, if I need more, the school has them.
4- Being USB-operated means I do not need to find an additional adapter/converter to use it.

Sounds good. The problem is, I don’t know how to use it, and the documentation I found is not for what I need, so further research is required.

………………………………………….

The following day, I went to school to loan a Kinect, and after some research saying it would not work on Windows 10… I finally found sources that mention it is possible.

So TADAAAAA!

Now I need to find a way to plug the data into Unity; then it will be possible to use it as a distance sensor.
(On a side note, I found that the Kinect V2 is much better as it has a wider field of view, so I might consider that too.)

 

Children’s Toy Wheel System

A remote-controlled children’s vehicle; it can take up to 30KG, so I bought it to use as a base since it will be able to carry all the components.

Just for the record, this thing cost $103.51 with a $2.99 shipping fee from China (I used EZ-Prime, hence the cheap shipping). I think the overall value is much higher than the cost; I will take it apart, use the components inside, and discard the BB-8-themed plastic case.