A Review of Haptic Feedback Teleoperation Systems
for Micromanipulation and Microassembly
Aude Bolopion and Stéphane Régnier
Abstract—This paper presents a review of the major haptic
feedback teleoperation systems for micromanipulation. During
the last decade, the handling of micrometer-sized objects has
become a critical issue. Fields of application from material science
to electronics demonstrate an urgent need for intuitive and flexible
manipulation systems able to deal with small-scale industrial
projects and assembly tasks. Two main approaches have been
considered: fully automated tasks and manual operation. The first
one requires fully predetermined tasks, while the latter necessitates
highly trained operators. To overcome these issues, the use of
haptic feedback teleoperation, where the user manipulates the
tool through a joystick while feeling a force feedback, appears
to be a promising solution as it allows high intuitiveness and
flexibility. Major advances have been achieved during this last
decade, starting with systems that enable the operator to feel
the substrate topology, to the current state-of-the-art where 3D
haptic feedback is provided to aid manipulation tasks. This paper
details the major achievements and the solutions that have been
developed to propose 3D haptic feedback for tools that often lack
3D force measurements. The use of virtual reality to enhance the
immersion is also addressed. The strategies developed provide
haptic feedback teleoperation systems with a high degree of assistance and for a wide range of micromanipulation tools. Based on
this expertise on haptics for micromanipulation and virtual reality
assistance, it is now possible to propose microassembly systems for
objects as small as 1 to 10 micrometers. This is a mature field and
will benefit small-scale industrial projects where precision and
flexibility in microassembly are required.
Note to Practitioners—This paper is motivated by the urgent need for intuitive and flexible manipulation systems able to deal with assembly tasks on the microscale. A new and promising solution is presented here: teleoperation with force feedback, where an operator uses a joystick to control the tool at the microscale while experiencing the interaction forces between the tool and the environment. Feedback assistance to the user using attractive and repulsive force fields is also proposed to help achieve the assembly task. Examples are given in this paper to illustrate this approach. The presented techniques can be readily applied to most micromanipulation systems to perform advanced microassembly tasks.
Index Terms—Haptic feedback, microassembly, micromanipulation, teleoperation.
Manuscript received October 08, 2012; accepted January 29, 2013. Date of
publication March 08, 2013; date of current version June 27, 2013. This paper
was recommended for publication by Associate Editor P. X. Liu and Editor
K. Bohringer upon evaluation of the reviewers’ comments. This work was supported by the French National Agency of Research through the NANOROBUST project.
A. Bolopion is with the Department Automatique et Systèmes Micro-Mécatroniques, CNRS, Institut FEMTO-ST, 25000 Besançon, France (e-mail: aude.
S. Régnier is with the Institut des Systèmes Intelligents et de Robotique, Université Pierre et Marie Curie, 75005 Paris, France (e-mail: regnier@isir.upmc.
Digital Object Identifier 10.1109/TASE.2013.2245122
DURING the last decade, interest in microassembly has increased dramatically across a wide range of application
fields from material science to electronics [1]. Robotic devices
based on thermal actuation, shape-memory alloys, and piezoelectric or electrostatic principles have been developed to enable
precise movements [2]. Several tools have been proposed for
performing 3D operations, such as cantilevers and grippers [3],
[4]. Automated tasks have been implemented for the assembly
of objects the size of hundreds of micrometers [5]. However,
this solution is valid only in a highly controlled environment
[6], which limits the industrial applications because it makes
the operation time-consuming and often expensive. In addition,
it must be applied to large-scale projects where a given predefined task must be performed, which is often not feasible when
dealing with the assembly of novel products, such as innovative
Micro Opto Electro Mechanical Systems (MOEMS) or Microelectromechanical Systems (MEMS). On the microscale, most
of the projects are small-scale industrial development or novel
protocols. Proposing a fully automated setup for each required assembly can be time-consuming or even inefficient, as most of the protocols are not completely defined before the operation. The user’s expertise and capacity to adapt the manipulation protocol to environmental disturbances and the peculiarities of the task are of the utmost importance to ensure the success
of the assembly. Teleoperated tasks, where an operator manually performs the assembly by controlling the robotic system
through a joystick, are thus a widely used solution [7]. However, only highly skilled operators can perform complex assemblies because the objects and tools are fragile, the systems
are highly sensitive to environmental conditions, and the visual
feedback is limited. Therefore, assistance must be provided in
order to enable a higher number of users to perform the operations. Haptic feedback entails providing operators with a force
feedback through the control joystick used for the manipulation
(Fig. 1) and is a promising solution [3], [8]. At the microscale, both theoretical works [9]–[11] and systems dedicated to micromanipulation [12]–[14] have been proposed.
This paper presents a review of the main haptic feedback
teleoperation systems for micromanipulation. The major issues
that must be addressed in order to obtain a usable system for
small-scale industrial projects are highlighted, and the solutions
proposed in the literature are presented. In particular, the problems induced by high scaling factors and time delays on the
performance of haptic coupling schemes are addressed. Solutions are proposed for providing haptic feedback on sensor deprived systems or systems where only part of the information
is available. Examples of haptic feedback rendering are provided for classical micromanipulation tasks, such as pushing or
1545-5955/$31.00 © 2013 IEEE
Fig. 1. Haptic feedback teleoperation system. The operator control the position
of the tool by controlling the position of the haptic device handle . Interaction forces applied on the tool are transmitted to the user as a haptic force
pick-and-place. Additional methods of assistance using virtual
reality are presented.
This paper is organized as follows. In Section II, the major
historical teleoperation systems are reviewed. The issues that
must be faced in order to improve the performance of these early
systems are presented in Section III, and solutions developed
in the literature are reviewed. Haptic feedback teleoperation
systems for micromanipulation tasks are given in Section IV,
and Section V presents solutions incorporating additional assistance. Section VI concludes this paper.
The first use of teleoperation systems for micro- and
nanoscales applications was published in [15] in 1990. The
goal was to make a system able to reproduce the movements of
the operator and scale them down to control a manipulator. It
should also be able to reproduce the phenomena occurring at the
microscale through visual and haptic feedback. However, this
first implementation only gave access to a visual feedback, and
information about forces was given by visual indications, not
haptically. The first teleoperation system with haptic feedback
appears in [16]. The haptic device is linked to a scanning tunneling microscope. Users control the in-plane displacement of
the tip of the microscope. The vertical movement of the handle
of the haptic device follows the vertical movements of the tip
so that users can “feel” the topology of the substrate. However,
significant noise and hysteresis limit the possible applications.
Since these two first works, teleoperation systems with haptic feedback have been mainly developed for Atomic Force Microscope (AFM)-based manipulation (Fig. 2). Two main reasons
justify this choice: the AFM is one of the most commonly used
tools for manipulation of micron-sized objects and, more importantly, it is one of the few that enable force measurement, which
is a critical issue to provide haptic feedback. The first system
is presented in [17]. Only one degree of freedom is available,
and users control the in-plane position of the tip using a mouse.
They can feel the repulsive forces when a force is applied on
the substrate by the tip, as well as the attractive forces when the
tip is lifted away from the substrate [18]. Simple experiments
have also been performed in [19]. In particular, tasks of picking
up spheres by adhesion and releasing them by rolling have been
realized. Only vertical forces are transmitted, and the use of a
piezoresistive sensor limits the resolution of the measured force.
These first systems make it possible to “touch the microworld,” but not to feel interaction forces between the tool
and an object, such as the grasping force applied by a gripper.
They are thus not effective at performing manipulation or
assembly tasks with controlled interaction force. The reasons
for these limitations are detailed in the next section.
Fig. 2. Haptic feedback teleoperation systems, Institute of Industrial Sciences,
Tokyo [20].
Fig. 3. Direct force feedback coupling scheme. Adapted from [21].
The limited application of these systems is due to two main
reasons. The first one is the stability issue. The scaling factors
introduced to match variables in the macro- and micro-worlds
introduce instabilities. In addition, time delays that occur while
dealing with simulated environments or when using vision sensors to compute the haptic feedback induce instabilities. The
second reason is the incomplete force measurement. The force
applied on the tool is most of the time deduced from the measurement of the tool’s deformations. However, it is usually limited to 1D or 2D force sensing, depending on the geometry of
the tool, and it provides only limited information on the interaction force between the tool and the object.
A. Stability for High Scaling Gains Coupling Schemes and
Time Delayed Systems
A detailed analysis of the control schemes is of the utmost
importance to provide a haptic feedback of good quality. This
feedback should be stable, or else the user will need to compensate for large oscillations of the joystick that are disturbing, and
might cause damage to either the haptic device or the tool. It
should also be transparent, which means that the user can feel
with a high degree of fidelity the interaction forces applied on
the tool.
The control scheme depicted in Fig. 3 is the most intuitive formulation to provide amplified forces to the operator [21]. Basically, the user operates a haptic device in the macro-world by applying a force to impose the displacements of the slave device in the micro-world. The velocity of the haptic device is scaled down by a displacement scaling factor to be used as the input of the nanotranslator. The force applied on the tool is amplified by a force scaling factor before being sent to the user as the haptic feedback.
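As a concrete illustration, the coupling of Fig. 3 reduces to two multiplications per control cycle. The sketch below is not from the paper; the function name, parameter names, and numeric gains are illustrative assumptions:

```python
# Minimal sketch of the direct force-feedback coupling (Fig. 3).
# Gains are illustrative: 1 cm of hand motion maps to 10 nm of tool motion,
# and 1 nN of tool force is rendered as 1 mN on the handle.

def coupling_step(haptic_velocity, tool_force, alpha_d=1e-6, alpha_f=1e6):
    """One cycle of the bilateral coupling.

    alpha_d: displacement/velocity scaling factor, macro -> micro.
    alpha_f: force amplification factor, micro -> macro.
    As [21] notes, too aggressive a choice (large force amplification or
    small displacement scaling) can destabilize the loop.
    """
    slave_velocity = alpha_d * haptic_velocity  # command to the nanotranslator
    feedback_force = alpha_f * tool_force       # force rendered on the handle
    return slave_velocity, feedback_force

# A 1 cm/s handle motion and a 2 nN tool force:
v_slave, f_haptic = coupling_step(1e-2, 2e-9)  # -> 10 nm/s command, 2 mN feedback
```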
Solutions proposed for macro-sized systems are largely used to ensure stability, such as in [19] where the Llewellyn criterion [22] is applied. However, the solutions proposed for macro-sized systems may not be adapted to the specificities of the
micro-world. [23] presents an adapted passivity controller that
enables users to feel attractive forces. It is first tested through
simulations, before [24] demonstrates its application on a real
system. The homothetic factors play a major role in the stability
of the coupling schemes [17], [18]. A detailed analysis is provided in [21], which states rules to tune the coupling parameters to ensure stability and transparency. In particular, it highlights that a large amplification of the forces or large displacements (i.e., a large force scaling factor or a small displacement scaling factor) might induce instabilities. The ratio of
the scaling factors must be less than a factor that depends on the
haptic interface and the nanotranslator characteristics.
In addition to high scaling factors, a major source of instability is time delay. This issue becomes critical for applications
where long-distance teleoperation is involved, or for interaction
with complex virtual scenes or when vision-based sensors with
slow acquisition time are used to compute haptic feedback. This
issue is highly critical since at this scale objects experience high
accelerations due to their small inertia. In addition, the high sensitivity of these systems to environmental conditions makes the
prediction of the position of the manipulated object difficult.
The stability of the system despite delays and modeling uncertainties is studied in [25] and [26], which propose, respectively, a wave-variable approach and a dedicated controller. However, the degradation
of the transparency while implementing these control laws is a
critical issue. Strategies to limit the time delays and acquisition
time of the sensors are necessary. In particular, high-speed vision sensing is presented in the next sections of this paper.
B. Limited Position and Force Measurement
The second major issue is the limited position and force measurement. The development of such systems faces a major obstacle: the lack of position and force feedback [2]. Sensors have
been developed [27], [28], but their integration into dedicated
tools increases significantly the complexity and the cost of the
tool fabrication.
AFMs were the first tools used for haptic feedback on
the microscale since they enable force measurement. However,
since forces are computed from the measurement of the cantilever’s deformations, only two measures are available: the vertical bending and the torsion [29]. To improve the haptic feedback, [30] analyzes the relation between the 3D force applied
on the cantilever and the measure of the deformations by taking
into account the direction of the cantilever movement. However,
this technique is highly sensitive to measurement noise and
numerical errors during the computation of the force.
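In the simplest linear setting, the two observable components reduce to Hooke's law applied to the cantilever's bending and torsion. The stiffness values below are made-up, textbook-order examples, not figures from the cited works:

```python
# Hooke's-law estimate of the two force components observable from an AFM
# cantilever. Stiffness values are illustrative assumptions.

def cantilever_forces(vertical_deflection, torsion_angle,
                      k_bend=0.1,   # vertical bending stiffness, N/m (assumed)
                      k_lat=2e-9):  # torsion-to-lateral-force gain, N/rad (assumed)
    """Only the normal and lateral components are measurable [29]; the
    remaining in-plane component is unobserved, which is why [30] must
    exploit the direction of motion to estimate the full 3D force."""
    f_normal = k_bend * vertical_deflection
    f_lateral = k_lat * torsion_angle
    return f_normal, f_lateral

# A 20 nm deflection of a 0.1 N/m lever corresponds to a 2 nN normal force.
f_n, f_l = cantilever_forces(20e-9, 0.0)
```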
Another approach is proposed in [31], which uses a model of
friction between the tip and the substrate. The topology of the
substrate is assumed to be known, for example, from previous
AFM scans. This solution is promising to provide users with
information about the substrate, but it cannot be used for manipulations, as the interaction force between the object and the
tool cannot be determined.
Fig. 4. Haptic feedback of an approach-retract experiment. The cantilever is approached toward the substrate. Users can feel interaction forces and, in particular, the snap-in phenomenon (on the order of a nanonewton). When contact is reached, a force is applied on the substrate by the cantilever and users feel a repulsive force. The cantilever is then moved away from the substrate. Users feel attractive forces until the cantilever is detached from the substrate. Adapted from [21].
In addition to AFM, more complex tools have been developed to perform advanced assembly tasks. In particular, microgrippers are commonly used [1]. Even if some of them offer
sensing capabilities (at the expense of a complex design) [32],
[33], most still lack force measurement capabilities [34], [35].
In particular, only a few grippers enable the manipulation of objects of less than 10 µm with force feedback, and most of them
are only prototypes.
To overcome the lack of force-sensing, vision is a promising
solution [36], [37]. It is used to measure the deformations of the
tool [38]. Force is then estimated based on the mechanical properties of the tool [39], [40]. This solution avoids the complexity
of force sensor integration, while providing feedback to compute the haptic force. Strategies based on virtual guides, using
simulators or not, have also been developed to compensate for
the limited position and force measurement. They can be used,
for example, for micromanipulation under Scanning Electron
Microscopes (SEM), where closed-loop positioning tools are either expensive or have a limited bandwidth. In this case, the vision-based haptic feedback enables users to close the loop, and
compensate for the lack of closed-loop positioning units. This
technique is used in some of the applications detailed in the next sections.
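The vision-based force estimation mentioned above amounts, in its simplest form, to converting an observed tool deflection from pixels to meters and applying the tool's stiffness. The pixel scale and stiffness below are illustrative assumptions:

```python
# Sketch of vision-based force estimation: tool deformation measured in the
# image, converted to force via the tool's mechanical model ([38]-[40]).
# m_per_px and k_tool are made-up example values.

def force_from_vision(tip_px, rest_px, m_per_px=50e-9, k_tool=5.0):
    """Estimate the force (N) bending a gripper finger from the observed
    displacement of its tip in the image (pixels)."""
    deflection = (tip_px - rest_px) * m_per_px  # pixels -> meters
    return k_tool * deflection                  # linear elastic finger model

# A 10-pixel deflection at 50 nm/pixel on a 5 N/m finger gives 2.5 uN.
f = force_from_vision(110, 100)
```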
A. Haptic Feedback of Nanonewton Interaction Forces
Using AFM, the laser reflected on the cantilever makes it
possible to measure the forces applied on the tool and to get
a nanonewton force resolution measurement [41]. With an appropriate haptic coupling scheme, it is thus possible to render to
operators very weak interaction forces, such as attractive forces
between the substrate and a cantilever [21]. In particular, the
snap-in phenomenon occurring while approaching the tip toward the substrate, which is on the order of a nanonewton, is transmitted to
users (Fig. 4). These systems enable a better comprehension of
interactions between the objects, and in particular the influence
of the nature of the substrate on the attractive force [24].
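The snap-in felt by the user has a simple mechanical interpretation: the tip jumps to contact when the gradient of the attractive force exceeds the cantilever stiffness. The sphere-plane van der Waals model below, with textbook-order values for the Hamaker constant, tip radius, and stiffness (all assumptions, not values taken from [21] or [24]), reproduces sub-nanonewton snap-in forces at nanometer separations:

```python
# Illustrative sphere-plane van der Waals model of the snap-in instability.
A_H = 1e-19   # Hamaker constant, J (assumed textbook-order value)
R = 20e-9     # tip radius, m (assumed)
k = 0.1       # cantilever stiffness, N/m (assumed)

def vdw_force(d):
    """Sphere-plane van der Waals attraction (N) at separation d (m)."""
    return -A_H * R / (6 * d ** 2)

# Snap-in occurs where the force gradient A_H*R/(3*d^3) reaches k:
d_snap = (A_H * R / (3 * k)) ** (1.0 / 3.0)  # nanometer-scale separation
f_snap = vdw_force(d_snap)                   # a fraction of a nanonewton
```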
B. Virtual Guides
Fig. 5. Virtual guide that assists the user in keeping the sphere at the extremity of the cantilever. The further the sphere is from the extremity, the greater the haptic force that pulls the user toward the predefined position [43]. Note that the insets representing the cantilever and the sphere are illustrations only. In the real setup only a top view is available, and the sphere is hidden by the cantilever. The haptic force is thus the only feedback that enables users to localize the sphere.
Instead of transmitting haptic forces that perfectly match measured forces, it can be interesting to define haptic forces that will help users to perform a given task. These virtual
guides are used either to “pull” users into what is assumed to
be the correct position, or to “push” them away from areas
where the tool should not go, for example, to avoid collisions
[42]. The user is thus guided, but he/she can decide to override
this indication by applying a force greater than the haptic force
on the joystick handle. The definition of these virtual guides is
highly related to the task that must be done, and to the available
position or force measurement.
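A one-dimensional sketch of such guides: an attractive spring "pulls" the handle toward the assumed correct position, and a short-range repulsive field "pushes" it away from a forbidden area. Gains and distances are arbitrary illustrative values, not those of [42]:

```python
import math

# Illustrative 1D virtual guide: attractive spring toward a target plus a
# short-range repulsive field around an obstacle. All values are assumptions.

def guide_force(pos, target, obstacle, k_pull=50.0, k_push=1e-4, d0=0.05):
    """Haptic guide force (N) at handle position pos (m)."""
    f = k_pull * (target - pos)          # "pull" toward the desired position
    d = pos - obstacle
    if 0 < abs(d) < d0:                  # "push" only inside the influence zone
        f += k_push * (1 / abs(d) - 1 / d0) * math.copysign(1.0, d)
    return f
```

As described above, the user can still override the guide by applying a larger force on the handle.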
Virtual guides dedicated to rolling tasks are demonstrated in
[43], where 2D haptic feedback is provided. They assist users
to keep the sphere under the middle line of the cantilever and
at its extremity by providing them with information about the
position of the object under the cantilever (Fig. 5). In [44], the
two possibilities (faithful rendering of interaction forces or virtual guides) are proposed to pick up and place a microsphere.
Virtual guides assist users in pulling the sphere to the desired altitude, and a repulsive force avoids any involuntary collision
with the substrate.
However, it is not straightforward to determine which haptic
feedback (faithful rendering of interaction forces or virtual
guides) is the best adapted. This depends on the goal of the
teleoperation. If a better comprehension of haptic phenomena
is wanted, a faithful rendering is helpful, whereas to perform a
given manipulation task virtual guides prove to be effective.
C. Haptic Feedback for Systems With Limited Position and
Force Sensing Capabilities
Some works propose to use two AFM cantilevers with protruding tips to form a gripper [45]. To detect the position of the object, the cantilevers are used in dynamic mode to
ensure accurate measurements [46]. The cantilevers are excited
at their resonant frequency, and the amplitudes of the oscillations are measured. Adapted haptic feedback based on the measurement of these oscillations has been proposed in [44], where
virtual guides assist the user while aligning the cantilevers with
the object and while closing the gripper (Fig. 6). 3D pick-andFig. 6. Use of two AFM cantilevers to form a gripper: haptic feedback based
on measurement of cantilever oscillations [44].
Fig. 7. Haptic feedback for a pick-and-place operation using a sensor-deprived
microgripper. An asynchronous Address Event Representation silicon retina
and a conventional frame-based camera provide information about the relative
positions of the tool and the object to compute the haptic force [47].
place experiments of microspheres (diameter: 4–6 m) validate
the approach.
The previous approach based on two AFM cantilevers forming a gripper requires aligning each tip separately, which makes it unsuitable for non-expert users.
Classical grippers are better adapted to pick-and-place tasks, but
they have one major drawback: they are usually sensor-deprived
since the integration of sensors increases the complexity of the
design and of the fabrication process. Vision is a commonly used
solution for sensing; unfortunately, the low update rate of the
frame-based acquisition process of current available cameras
cannot ensure stable haptic feedback at the microscale, where low inertia produces dynamic phenomena too fast to track. A novel vision-based microrobotic system combining
an asynchronous Address Event Representation silicon retina
with a conventional frame-based camera is presented in [47].
Unlike frame-based cameras, recent artificial retinas transmit
their outputs as a continuous stream of asynchronous temporal
events, in a manner similar to the output cells of a biological
retina. The reduction of redundant information enables high update rates. The temporal precision of the asynchronous silicon
retina is used to provide a haptic feedback to assist users during
manipulation tasks, whereas the frame-based camera is used to
retrieve the position of the object that must be manipulated. This
approach is validated through an experiment on teleoperating
a sphere of around 50 µm in diameter using a piezoelectric
gripper in a pick-and-place task (Fig. 7).
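The division of labor in [47] can be caricatured as follows: the frame-based camera provides absolute positions at a low rate, while the event stream supplies high-rate incremental updates in between. The data layout, rates, and 1D position model below are illustrative assumptions, not the actual interface of the system:

```python
# Sketch of fusing slow absolute frames with fast incremental retina events.
# Timestamps, rates, and the 1D position model are illustrative assumptions.

def fused_positions(frame_updates, events):
    """frame_updates: {t: absolute position} from the frame-based camera.
    events: [(t, dpos)] incremental motions from the asynchronous retina.
    Returns {t: fused estimate}, dense enough for a kHz haptic loop."""
    pos, out = 0.0, {}
    for t, dpos in sorted(events):
        if t in frame_updates:
            pos = frame_updates[t]   # absolute correction at frame instants
        else:
            pos += dpos              # high-rate update between frames
        out[t] = pos
    return out

# Events every 1 ms; a frame at t = 0.004 s corrects accumulated drift.
est = fused_positions({0.004: 0.0100},
                      [(0.001, 0.001), (0.002, 0.001), (0.004, 0.0)])
```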
All the systems presented in the previous sections only provide haptic feedback based on available measurements. This is
Fig. 8. Virtual and augmented teleoperation systems.
direct teleoperation. Two other types of teleoperation exist: virtual teleoperation where the user interacts with a simulator, and
augmented teleoperation where he/she realizes a manipulation
on a real object, but additional assistance is provided through
the use of a simulator (Fig. 8).
A. Virtual Teleoperation
Virtual teleoperation has three main application fields: education, training, and evaluation. At this scale, the objects
and the tools are fragile, and the systems are highly sensitive to
environmental conditions. Real systems are thus inappropriate
for tests since it is not possible to guarantee the same experimental conditions for two trials.
Simulators have been developed for educational purposes. In
[48], the benefit of haptic feedback and visual analogy for the
comprehension of nanoscale phenomena is evaluated. A combination of these two modalities proves to be effective in teaching
physics at the microscale to students.
Simulators can also prove to be efficient tools for training.
Contrary to education, where the goal is to teach students unknown phenomena, training is dedicated to operators who work
in the micromanipulation field. These simulators are then used
by novice operators to learn technical gestures to perform a
given assembly task, or by expert users to test different manipulation strategies. Virtual teleoperation systems have been
developed to feel substrate geometries [49], or to simulate indentation tasks [23]. Some simulators can be adapted to the experimental conditions. For example, in [50] the geometry of the
substrate is directly interpolated from real measurements. Several physical parameters can be tuned to change the physical
properties such as friction. This ensures a realistic haptic rendering, which is necessary for training.
The simulators are also used for the definition of the most appropriate coupling schemes. Different haptic couplings are compared in [23]. Since the simulation guarantees the same experimental conditions, the performance of each control scheme can
be evaluated.
B. Augmented Teleoperation
Augmented teleoperation systems make it possible to perform tasks on real objects, and benefit from additional information based on the simulator (Fig. 9).
The first augmented teleoperated system is proposed in [51].
The manipulation is semi-teleoperated: the operator controls the
overall operation, but some tasks are performed automatically
[52], [53]. Users can thus concentrate on the main task, while
the technical gestures are performed automatically.
Fig. 9. The nanoManipulator offers semi-teleoperated augmented reality teleoperation [54].
Fig. 10. Visual reconstruction of the scene for augmented teleoperation systems. (a) Representation of the deformations of the substrate [24], (b) Addition
of virtual information to assist users to avoid obstacles [42].
Most augmented teleoperation systems propose a visual reconstruction of the scene. This is of utmost importance since
on a microscale the visual feedback is limited. It is not possible
to get cheap real-time visual feedback, with depth information, of objects smaller than hundreds of micrometers. The visual reconstruction of the scene makes it possible to add information
(Fig. 10). It can be used to highlight physical phenomena such
as the deformations applied on objects [55], [24]. Additional information can also be displayed to assist the user to perform a
given task. In [42] the optimal path, as well as areas that should
be avoided to prevent collisions between the tool and the objects, are represented.
In the case of remote teleoperation, for example, if the user
and the micromanipulation system are situated in geographically distant locations, the virtual reconstruction of the scene
makes it possible to limit the load of the data transmitted. In
[56], a 3D stereoscopic view of the scene is reconstructed based
on the position of the object and the tool derived from images
coming from a scanning electron microscope. Instead of transmitting full images, only two positions are sent between the two
geographically distant sites to provide a real-time visual feedback (Fig. 11).
All these systems provide a haptic feedback. The visual reconstruction of the scene, which can use a simulator to provide a more realistic rendering, enhances the intuitiveness of
the system. In addition, an audio display can be considered, as
in [57] where an audio representation of the contact between a
micro tactile sensor and the substrate is proposed. Audio feedback can also be combined with haptic feedback to enhance the
assistance provided to users [58], [59]. However, its use remains
rather limited on the microscale.
Since the first works developed in the 1990s, haptic feedback teleoperation systems have evolved dramatically. The
Fig. 11. Software architecture of a teleoperation system enabling micromanipulation from France of objects situated in Germany [56]. Information about the
relative positions of the objects and the tool is derived from SEM images and
transmitted to the remote teleoperation system. This is used to reconstruct a 3D
virtual scene and to provide haptic feedback.
first AFM-based systems made it possible to “touch the microworld” by providing the feeling of the substrate topology. Detailed analyses of the haptic coupling scheme performance made
it possible to improve both stability and transparency. Based on
the determination of strategies to provide 2D or 3D haptic feedback despite the lack of force and position sensors, more complex applications have been proposed. Vision-based haptic feedback makes it possible to use a wide range of sensor-deprived
tools, and in particular the microgrippers that are present in
most microassembly platforms. Augmented teleoperation systems that use simulators to derive additional information improve the assistance provided to operators. Virtual teleoperation
systems enable the training of users, as well as testing manipulation strategies.
Based on all these developments on haptic feedback teleoperation for micromanipulation and virtual reality assistance,
this field is mature for microassembly tasks involving objects
as small as 1–10 micrometers. The strategies developed for
research laboratory needs are now ready for technology transfer. The solutions reviewed throughout this paper
can be integrated to fulfill the requirements of an industrial
microassembly system. This can benefit small-scale industrial
projects where complex assembly tasks of objects whose size
ranges from a few micrometers to several hundred micrometers must be performed. This leverages the user’s expertise
by enabling operators to concentrate on critical issues of the
manipulation while assisting them to perform the assembly.
Several future directions can be foreseen to increase the effectiveness of such systems. In particular, specific haptic interfaces should be developed: most of the works presented in this paper use commercially available haptic interfaces, none of which are dedicated to microassembly tasks. Interfaces with adapted design and haptic feedback specifications should be conceived. Large-scale tests with industrial end users should be performed to ensure a close match between the haptic feedback and the industrial needs. Most of the works presented in
this paper deal with AFM-based systems, as they were historically widely used. Future developments should concentrate on
providing haptic feedback for lower-cost, user-friendly tools, to efficiently address industrial microassembly projects.
[1] M. Gauthier and S. Régnier, Robotic Micro-Assembly. New York,
NY, USA: Wiley, 2010.
[2] S. Régnier and N. Chaillet, Microrobotics for Micromanipulation.
New York, NY, USA: Wiley, 2010.
[3] H. Xie, C. D. Onal, S. Régnier, and M. Sitti, Atomic Force Microscopy
Based Nanorobotics, ser. Tracts in Advanced Robotics. New York,
NY, USA: Springer, 2011, vol. 71 (Modelling, simulation, setup building and experiments).
[4] T. C. Duc, G. Lau, J. F. Creemer, and P. M. Sarro, “Electrothermal microgripper with large jaw displacement and integrated force sensors,”
J. Microelectromech. Syst., vol. 17, no. 6, pp. 1546–1555, 2008.
[5] L. Wang, L. Ren, J. Mills, and W. Cleghorn, “Automated 3-D micrograsping tasks performed by vision-based control,” IEEE Trans.
Autom. Sci. Eng., vol. 7, no. 3, pp. 417–426, Jul. 2010.
[6] C. Onal and M. Sitti, “Visual servoing-based autonomous 2-D manipulation of microparticles using a nanoprobe,” IEEE Trans. Control Syst.
Technol., vol. 15, no. 5, pp. 842–852, Sep. 2007.
[7] S. Bargiel, K. Rabenorosoa, C. Clévy, C. Gorecki, and P. Lutz, “Towards micro-assembly of hybrid MOEMS components on reconfigurable silicon free-space micro-optical bench,” J. Micromechan. Microeng., vol. 20, no. 4, p. 20, 2010.
[8] A. Ferreira and C. Mavroidis, “Virtual reality and haptics for
nanorobotics,” IEEE Robot. Autom. Mag., vol. 13, no. 3, pp. 78–92,
Sep. 2006.
[9] J. E. Colgate, “Robust impedance shaping telemanipulation,” IEEE
Trans. Robot. Autom., vol. 9, no. 4, pp. 374–384, Aug. 1993.
[10] Y. Yokokohji, N. Hosotani, and T. Yoshikawa, “Analysis of maneuverability and stability of micro-teleoperation systems,” in IEEE Int.
Conf. Robot. Autom., 1994, pp. 237–243.
[11] Q. Zhou, P. Kallio, and H. N. Koivo, “Fuzzy control system for
microtelemanipulation,” Intelligent Automation and Control: Recent
Trends in Development and Applications, pp. 817–822, 1996.
[12] T. Fukuda, K. Tanie, and T. Mitsuoka, “A new method of master-slave
type of teleoperation for a micro-manipulator system,” in IEEE Micro
Robot. Teleoper. Workshop, 1987, pp. 204–208.
[13] S. E. Salcudean and J. Yan, “Towards a force-reflecting motion-scaling
system for microsurgery,” in IEEE Int. Conf. Robot. Autom., 1994.
[14] P. Kallio and H. N. Koivo, “A 1 d.o.f. mini-telemanipulator: Design
and control,” in Int. Conf. Robot. Manuf., 1995, pp. 192–198.
[15] Y. Hatamura and H. Morishita, “Direct coupling system between
nanometer world and human world,” in IEEE Int. Conf. Micro Electro
Mechan. Syst., 1990, pp. 203–208.
[16] R. Hollis, S. Salcudean, and D. Abraham, “Toward a tele-nanorobotic
manipulation system with atomic scale force feedback and motion resolution,” in IEEE Int. Conf. Micro Electro Mechan. Syst., 1990.
[17] M. Sitti and H. Hashimoto, “Macro to nano tele-manipulation through
nanoelectromechanical systems,” in Proc. IEEE Ind. Electron. Soc. Conf.,
1998, pp. 98–103.
[18] M. Sitti and H. Hashimoto, “Teleoperated touch feedback from the surfaces at the nanoscale: Modeling and experiments,” IEEE/ASME Trans.
Mechatronics, vol. 8, no. 2, pp. 287–298, Jun. 2003.
[19] G. Venture, D. S. Haliyo, A. Micaelli, and S. Régnier, “Force-feedback micromanipulation with unconditionally stable coupling,” Int. J.
Micromechatron., Special Issue on Micro-Handling, vol. 3, no. 3, pp.
307–327, 2006.
[20] M. Sitti and H. Hashimoto, “Tele-nanorobotics using atomic force microscope,” in IEEE/RSJ Int. Conf. Intell. Robot. Syst., 1998, vol. 3.
[21] A. Bolopion, B. Cagneau, S. Haliyo, and S. Régnier, “Analysis of stability and transparency for nanoscale force feedback in bilateral coupling,” J. Micro-Nano Mechatronics, no. 4, pp. 145–158, 2009.
[22] F. Llewellyn, “Some fundamental properties of transmission systems,”
Proc. IRE, vol. 40, no. 3, pp. 271–283, 1952.
[23] S.-G. Kim and M. Sitti, “Task-based and stable telenanomanipulation
in a nanoscale virtual environment,” IEEE Trans. Autom. Sci. Eng., vol.
3, no. 3, pp. 240–247, Jul. 2006.
[24] C. D. Onal and M. Sitti, “A scaled bilateral control system for experimental one-dimensional teleoperated nanomanipulation,” Int. J. Robot.
Res., vol. 28, no. 4, pp. 484–497, 2009.
[25] M. Boukhnifer and A. Ferreira, “Wave-based passive control for transparent micro-teleoperation system,” Robot. Autonomous Syst., vol. 54,
no. 7, pp. 601–615, 2006.
[26] M. Boukhnifer and A. Ferreira, “H∞ loop shaping bilateral controller
for a two-fingered telemicromanipulation system,” IEEE Trans. Control Syst. Technol., vol. 15, no. 5, pp. 891–905, 2007.
[27] S. Fahlbusch and S. Fatikow, “Force sensing in microrobotic systems—An overview,” in Int. Conf. Electron., Circuits Syst., 1998, vol.
3, pp. 259–262.
[28] F. Beyeler, S. Muntwyler, and B. J. Nelson, “A six-axis MEMS force-torque
sensor with micro-newton and nano-newton-meter resolution,”
J. Microelectromech. Syst., vol. 18, no. 2, pp. 433–441, 2009.
[29] J. Zhang, G. Li, and N. Xi, “Modeling and control of active end effector for the AFM based nano robotic manipulators,” in IEEE Int.
Conf. Robot. Autom., 2005, pp. 163–168.
[30] L. Liu, N. Jiao, X. Tian, Z. Dong, N. Xi, W. Li, and Y. Wang, “Development of a haptic user interface for surface sensing and nanomanipulation based on atomic force microscope,” in IEEE Int. Conf. Nano/Micro
Eng. Molecular Syst., 2006, pp. 900–904.
[31] C. D. Onal and M. Sitti, “Teleoperated 3-D force feedback from
the nanoscale with an atomic force microscope,” IEEE Trans. Nanotechnol., vol. 9, no. 1, pp. 46–54, Jan. 2010.
[32] K. Kim, X. Liu, Y. Zhang, and Y. Sun, “Micronewton force-controlled
manipulation of biomaterials using a monolithic MEMS microgripper
with two-axis force feedback,” in IEEE Int. Conf. Robot. Autom., 2008,
pp. 3100–3105.
[33] [Online]. Available: http://www.femtotools.com
[34] B. Lopez-Walle, M. Gauthier, and N. Chaillet, “Principle of a submerged freeze gripper for microassembly,” IEEE Trans. Robot., vol.
24, no. 4, pp. 897–902, Aug. 2008.
[35] K. N. Andersen, D. H. Petersen, K. Carlson, K. Molhave, O. Sardan,
A. Horsewell, V. Eichhorn, S. Fatikow, and P. Boggild, “Multimodal
electrothermal silicon microgrippers for nanotube manipulation,”
IEEE Trans. Nanotechnol., vol. 8, no. 1, pp. 76–85, Jan. 2009.
[36] Y. Zhou, B. Nelson, and B. Vikramaditya, “Fusing force and vision
feedback for micromanipulation,” in IEEE Int. Conf. Robot. Autom.,
1998, vol. 2, pp. 1220–1225.
[37] M. Greminger and B. Nelson, “Vision-based force measurement,”
IEEE Trans. Pattern Anal. Machine Intell., vol. 26, no. 3, pp. 290–298,
Mar. 2004.
[38] A. N. Reddy, D. K. Sahu, N. Maheswari, and G. K. Ananthasuresh,
“Miniature compliant grippers with vision-based force-sensing,” IEEE
Trans. Robot., vol. 26, no. 5, pp. 867–877, Oct. 2010.
[39] M. Greminger, G. Yang, and B. Nelson, “Sensing nanonewton level
forces by visually tracking structural deformations,” in IEEE Int. Conf.
Robot. Autom., 2002, vol. 2, pp. 1943–1948.
[40] D. Cappelleri, G. Piazza, and V. Kumar, “A two-dimensional vision-based force sensor for microrobotic applications,” Sens. Actuators A:
Phys., vol. 171, no. 2, pp. 340–351, 2011.
[41] D. Haliyo, S. Régnier, and J. Guinot, “µMAD, the adhesion-based
dynamic micro-manipulator,” Eur. J. Mech.-A/Solids, vol. 22, no. 6,
pp. 903–916, 2003.
[42] M. Ammi and A. Ferreira, “Robotic assisted micromanipulation system
using virtual fixtures and metaphors,” in IEEE Int. Conf. Robot. Autom.,
2007, pp. 454–460.
[43] A. Bolopion, B. Cagneau, and S. Régnier, “2D micro teleoperation with
force feedback,” in IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2009.
[44] A. Bolopion, H. Xie, S. Haliyo, and S. Régnier, “Haptic teleoperation for 3D microassembly of spherical objects,” IEEE/ASME Trans.
Mechatronics, vol. 17, no. 1, pp. 116–127, 2012.
[45] H. Xie and S. Régnier, “Development of a flexible robotic system
for multiscale applications of micro/nanoscale manipulation and
assembly,” IEEE/ASME Trans. Mechatronics, vol. 16, no. 2, pp.
266–276, Apr. 2011.
[46] H. Xie, S. Haliyo, and S. Régnier, “Parallel imaging/manipulation force
microscopy,” Appl. Phys. Lett., vol. 94, p. 153106, 2009.
[47] Z. Ni, A. Bolopion, J. Agnus, R. Benosman, and S. Régnier, “Asynchronous event-based visual shape tracking for stable haptic feedback
in microrobotics,” IEEE Trans. Robot., vol. 28, no. 5, pp. 1081–1089, 2012.
[48] G. Millet, A. Lécuyer, J. Burkhardt, D. Haliyo, and S. Régnier,
“Improving perception and understanding of nanoscale phenomena
using haptics and visual analogy,” in Eurohaptics. Berlin, Germany:
Springer-Verlag, 2008, pp. 847–856.
[49] S. Grange, F. Conti, P. Helmer, P. Rouiller, and C. Baur, “The delta
haptic device as a nanomanipulator,” in SPIE Microrobotics and Microassembly III, 2001.
[50] W. Vogl, B. K.-L. Ma, and M. Sitti, “Augmented reality user interface for an atomic force microscope-based nanorobotic system,” IEEE
Trans. Nanotechnol., vol. 5, no. 4, pp. 397–406, Jul. 2006.
[51] R. Taylor, II, J. Chen, S. Okimoto, N. Llopis-Artime, V. L. Chi, F.
P. Brooks, Jr., M. Falvo, S. Paulson, P. Thiansathaporn, D. Glick, S.
Washburn, and R. Superfine, “Pearls found on the way to the ideal interface for scanned-probe microscopes,” in Proc. 8th Conf. Visualization,
1997, p. 467.
[52] M. Falvo, G. Clary, A. Helser, S. Paulson, R. Taylor, II, V. Chi, F.
Brooks, Jr., S. Washburn, and R. Superfine, “Nanomanipulation experiments exploring frictional and mechanical properties of carbon nanotubes,” J. Microscopy Microanalysis, vol. 4, pp. 504–512, 1999.
[53] M. Guthold, M. Falvo, W. Matthews, S. Paulson, S. Washburn, D. Erie,
R. Superfine, F. Brooks, Jr., and R. Taylor, II, “Controlled manipulation
of molecular samples with the nanoManipulator,” IEEE/ASME Trans.
Mechatronics, vol. 5, no. 2, pp. 189–198, Jun. 2000.
[54] R. Taylor, II, “Programming force feedback devices in computer
graphics systems,” in Course Notes for “Programming Virtual
Worlds”, SIGGRAPH, 1997.
[55] L. Fok, Y. Liu, and W. Li, “Modeling of nanomanipulation with an integrated teleoperated system,” in IEEE Int. Conf. Robot. Biomimetics,
2005, pp. 83–88.
[56] A. Bolopion, C. Dahmen, C. Stolle, S. Haliyo, S. Régnier, and S.
Fatikow, “Vision based haptic feedback for remote micromanipulation
in a SEM environment,” Int. J. Optomechatronics, vol. 6, no. 3, pp.
236–252, 2012.
[57] Y. Murayama, C. E. Constantinou, and S. Omata, “Intuitive micromanipulation with haptic audio feedback,” in IEEE Int. Conf. Comput.
Inform. Technol., Sep. 2004, pp. 907–911.
[58] A. Luciani, D. Urma, S. Marlière, and J. Chevrier, “Presence: The sense
of believability of inaccessible worlds,” Comput. Graphics, vol. 28, pp.
509–517, 2004.
[59] F. Marchi, D. Urma, S. Marliere, J. L. Florens, A. Besancon, J.
Chevrier, and A. Luciani, “Educational tool for nanophysics using
multisensory rendering,” in World Haptics Conf., 2005.
Aude Bolopion received the Ph.D. degree in robotics
from the University of Pierre & Marie Curie, Paris,
France, in 2010.
She is currently a CNRS Researcher at the
FEMTO-ST Institute, Besancon, France. She has
been a member of the Biomedical Micro Nano
Robotics team since 2011. From 2007 to 2011, she
was a member of the ISIR micromanipulation team.
Her research interests are focused on teleoperation
and haptic feedback at the nanoscale, and on noncontact micromanipulation.
Stéphane Régnier received the Ph.D. degree in mechanics and robotics from the University of Pierre &
Marie Curie, Paris, France, in 1996.
He is currently a Professor with the Institute of Intelligent Systems and Robotics (ISIR), University of
Pierre & Marie Curie. He has been Head of the ISIR
micro/nanorobotics team since 2001. His research
interests are focused on micro- and nano-manipulation, teleoperation and haptic feedback at the
nanoscale, micromechatronics and biological cell
