The topics covered in this work are spread over an introduction and four parts. Each chapter concludes with a brief review of its main results and formulae, and each part ends with an exercise section. Part One treats the fundamentals of modern stability theory. Part Two is devoted to the optimal control of deterministic systems. Part Three is concerned with the control of systems subject to random disturbances of their parameters, and Part Four provides an outline of modern numerical methods of control theory. The many examples included illustrate the main assertions and teach the reader the skills needed to construct models of relevant phenomena, to design nonlinear control systems, to explain the qualitative differences between various classes of control systems, and to apply these methods to the investigation of particular systems.