29 Pages


Description

Level: Higher education, Doctorate, Bac+8

Discontinuous Feedback and Nonlinear Systems

Francis Clarke
Université de Lyon, Institut Camille Jordan, 69622 Villeurbanne, France (e-mail: )

Abstract: This tutorial paper is devoted to the controllability and stability of control systems that are nonlinear, and for which, for whatever reason, linearization fails. We begin by motivating the need for two seemingly exotic tools: nonsmooth control-Lyapunov functions, and discontinuous feedbacks. Then, after a (very) short course on nonsmooth analysis, we build a theory around these tools. We proceed to apply it in various contexts, focusing principally on the design of discontinuous stabilizing feedbacks.

Keywords: controllability, discontinuous control, feedback, nonlinear theory, stabilization

1. INTRODUCTION

Our interest centers throughout on the standard control system

    ẋ(t) = f(x(t), u(t)) a.e.,  u(t) ∈ U a.e.,  (∗)

where the dynamics function f : ℝⁿ × ℝᵐ → ℝⁿ and the control set U ⊆ ℝᵐ are given, and 'a.e.' is the abbreviation of 'almost everywhere'. A control on some interval [a, b] of interest refers to a measurable function u(·) defined on [a, b] and having values in U.
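As an illustrative sketch (not taken from the paper), a system of the form ẋ(t) = f(x(t), u(t)) can be simulated under a discontinuous feedback by the sample-and-hold scheme: the feedback is evaluated at sampling times and held constant between them, which sidesteps the fact that the closed-loop right-hand side is discontinuous. The dynamics, feedback law, and step size below are hypothetical choices for illustration only.

```python
import math

def simulate_sample_and_hold(f, k, x0, t_final, dt):
    """Euler simulation of x'(t) = f(x(t), u(t)) with a sample-and-hold
    feedback: u is frozen at u = k(x(t_i)) on each interval [t_i, t_i + dt)."""
    x, t = x0, 0.0
    trajectory = [(t, x)]
    while t < t_final:
        u = k(x)               # sample the (possibly discontinuous) feedback
        x = x + dt * f(x, u)   # hold u constant over one sampling step
        t += dt
        trajectory.append((t, x))
    return trajectory

# Hypothetical example: the scalar system x' = u with the discontinuous
# feedback k(x) = -sign(x), which drives the state toward the origin.
f = lambda x, u: u
k = lambda x: -math.copysign(1.0, x) if x != 0 else 0.0
traj = simulate_sample_and_hold(f, k, x0=1.0, t_final=2.0, dt=0.01)
```

With this step size the state reaches a small neighborhood of the origin and then chatters within one step of it, a behavior characteristic of sign-type discontinuous feedbacks under sampling.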


- existence theory
- optimal control
- time function
- nonlinear systems
- Lyapunov function
- system
- Hamilton–Jacobi equation

Subjects

Informations

Published by: mijec
Reads: 29
Language: English
Document size: 1 MB
