In this paper, a new method for model reduction of bilinear systems is presented. The proposed technique belongs to the family of gramian-based model reduction methods. It uses time-interval generalized gramians in the reduction procedure rather than the ordinary generalized gramians, thereby improving the accuracy of the approximation over the time interval on which the method is applied. The time-interval generalized gramians are the solutions to the generalized time-interval Lyapunov equations. Conditions under which these equations are solvable are derived, and an algorithm is proposed to solve them iteratively. The method is further illustrated with an example. The numerical results show that the method is more accurate than its counterpart based on the ordinary gramians.
Proceedings of the American Control Conference, 2013, pp. 5530-5535
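To make the iterative idea concrete, the sketch below computes a time-interval controllability gramian for a bilinear system ẋ = Ax + Σⱼ Nⱼx uⱼ + Bu by a fixed-point iteration: at each step, the bilinear coupling terms Nⱼ P Nⱼᵀ are frozen at the previous iterate and a standard (linear) time-limited Lyapunov equation is solved. This is a common way to solve generalized Lyapunov equations and is offered only as an illustrative assumption; the paper's own algorithm, solvability conditions, and equation form are not reproduced here. All names (`time_interval_gramian`, the test matrices) are hypothetical.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, expm

def time_interval_gramian(A, N_list, B, t1, t2, tol=1e-10, max_iter=500):
    """Fixed-point iteration for a time-interval generalized gramian P solving
        A P + P A^T + sum_j N_j P N_j^T + B1 B1^T - B2 B2^T = 0,
    where Bk = expm(A * tk) @ B.  Converges when A is Hurwitz and the
    N_j terms are small enough for the iteration map to be a contraction
    (an assumption, not the paper's stated solvability condition)."""
    B1 = expm(A * t1) @ B          # e^{A t1} B, start of the interval
    B2 = expm(A * t2) @ B          # e^{A t2} B, end of the interval
    M0 = B1 @ B1.T - B2 @ B2.T     # constant (time-interval) forcing term
    P = np.zeros_like(A)
    for _ in range(max_iter):
        # freeze the bilinear terms at the previous iterate
        rhs = M0 + sum(Nj @ P @ Nj.T for Nj in N_list)
        # solve the ordinary Lyapunov equation A X + X A^T = -rhs
        P_new = solve_continuous_lyapunov(A, -rhs)
        if np.linalg.norm(P_new - P) <= tol * max(1.0, np.linalg.norm(P_new)):
            return P_new
        P = P_new
    return P

# Small illustrative system: stable A, one weak bilinear term.
A = np.array([[-2.0, 0.0], [0.0, -3.0]])
N_list = [0.1 * np.array([[0.0, 1.0], [1.0, 0.0]])]
B = np.array([[1.0], [1.0]])
P = time_interval_gramian(A, N_list, B, 0.0, 1.0)

# Residual of the generalized time-interval Lyapunov equation
B1, B2 = expm(A * 0.0) @ B, expm(A * 1.0) @ B
residual = A @ P + P @ A.T + N_list[0] @ P @ N_list[0].T + B1 @ B1.T - B2 @ B2.T
```

At convergence the residual of the generalized equation vanishes, and `P` is symmetric; in a balancing-based reduction one would compute the observability counterpart analogously and truncate the states with small interval Hankel singular values.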