In this paper, we describe an efficient ontology integration model for improved context inference based on a distributed ontology framework. Context-aware computing with ontology-based inference is widely used in distributed surveillance environments. In such environments, surveillance devices such as smart cameras may carry heterogeneous video data with different transmission ranges, latencies, and formats. However, even smart devices generally have limited memory and power and can manage only part of the ontology data. In our efficient ontology integration model, each agent built into such a device obtains services not only from a region server but also from peer servers. For such a collaborative network, an effective cache framework that can handle heterogeneous devices is required for efficient ontology integration. In this paper, we propose an efficient ontology integration model that adapts to the actual demands of each device and those of its neighbors. Our scheme demonstrates that the efficiency of the model results in better context inference.