Parametric problems have been widely studied, and many methods have been proposed to reduce their computational cost. Reduced order modelling (ROM) achieves this goal by performing and storing a sequence of pre-computations in an expensive "offline" stage, then utilising the stored data to predict solutions of the parametric problem at low cost in an "online" stage. The (POD-)Greedy sampling algorithm is a powerful tool for obtaining these pre-computations in an optimal sense. Difficulties arise for conventional reduced order modelling when the system undergoes dynamic changes: first of all, a robust error estimate is needed for dynamic problems; moreover, a cost-effective procedure is required in the "offline" stage to generate an optimal set of sample points, so that the most representative reduced basis can be obtained while keeping the "offline" cost under control. In this thesis, a new POD-Greedy sampling algorithm which utilises a new error indicator will be presented. This error indicator aims to predict the path of optimal maximum-error convergence. The standard POD-Greedy approach requires exact solutions over the entire parameter domain when an a-posteriori error estimate is not available, and is therefore impractical. The proposed POD-Greedy algorithm instead avoids this massive number of exact computations by applying interpolation, so that the numerical efficiency can be improved. Another contribution is an "error in the error" indicator which drives the local adaptivity of the interpolation sample grids. This indicator compares low- and high-order interpolation schemes to determine the correct sequence of local h-refinements and Greedy iterations. Finally, a non-intrusive Abaqus/Matlab code coupling technique will be presented in the appendix to enable seamless integration of commercial software and Matlab source code in the computation of exact solutions.
The accuracy and feasibility of the proposed method will be demonstrated on a variety of test cases.
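To make the greedy sampling idea concrete, the skeleton of a generic greedy loop is sketched below in Python. This is an illustrative outline only, not the thesis's algorithm: the names `solve_exact`, `error_indicator`, the tolerance, and the toy least-squares residual used as an indicator are all assumptions introduced for the sketch, and the POD compression step and the interpolation-based surrogate indicator proposed in the thesis are deliberately omitted.

```python
import numpy as np

def greedy_sampling(param_grid, solve_exact, error_indicator, tol=1e-8, max_iter=20):
    """Generic greedy sampling loop (illustrative sketch).

    param_grid      : candidate parameter values (assumed finite training set)
    solve_exact     : callable mu -> high-fidelity solution snapshot (hypothetical name)
    error_indicator : callable (basis, mu) -> estimated approximation error at mu
    """
    mu0 = param_grid[0]                       # arbitrary initial sample point
    samples = [mu0]
    basis = [solve_exact(mu0)]                # reduced basis from the first snapshot
    for _ in range(max_iter):
        # Evaluate the error indicator over all candidate parameters.
        errs = np.array([error_indicator(basis, mu) for mu in param_grid])
        if errs.max() < tol:                  # converged: basis represents all candidates
            break
        worst = param_grid[int(np.argmax(errs))]
        samples.append(worst)                 # enrich at the worst-approximated point
        basis.append(solve_exact(worst))
    return samples, basis

# Toy usage: snapshots u(mu) = (1, mu, mu^2) live in a 3-dimensional space,
# so the loop should terminate after selecting three samples.
def projection_residual(basis, mu):
    # Residual of the best least-squares fit of u(mu) in span(basis).
    B = np.array(basis).T
    u = np.array([1.0, mu, mu**2])
    coef, *_ = np.linalg.lstsq(B, u, rcond=None)
    return float(np.linalg.norm(u - B @ coef))

if __name__ == "__main__":
    grid = np.linspace(0.0, 1.0, 21)
    toy_solve = lambda mu: np.array([1.0, mu, mu**2])
    samples, basis = greedy_sampling(grid, toy_solve, projection_residual)
    print(len(samples))
```

In a practical ROM setting the snapshots would be compressed by POD after each enrichment, and, as argued above, the exact solves over the full training grid would be replaced by an interpolated surrogate of the error.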