Data-driven applications in computational science react in real time to their environment in a complex detect-analyze-response cycle. These computations can often be viewed as complex data flow graphs with components that are both data-intensive and computationally intensive, requiring access to live data feeds and to large-scale computational resources. In the process of discovering new knowledge, a user may cycle through multiple graphs that access data from sensors, instruments, databases, and large collections of files. This research investigates a programming model and framework for knowledge discovery in data-driven applications. Users program the system through declarative specification of detect-analyze-response behavior. Underlying the programming model is a continuous rule-based event processor and workflow orchestration engine organized as Web services. The research formalizes an abstract model of interaction and maps this higher-level conceptualization onto the event processing and workflow runtime components. It demonstrates that the model supports a unique adaptive framework in which knowledge gained from computational and data analysis can be fed back to the data event streams. The approach is validated experimentally through quantifiable metrics and by its application to two model problems: severe storm prediction, in which a weather forecast is triggered by data mining results from radar or model data, and adaptive resource management, in which hardware and software resources and environmental data streams are monitored to predict resource requirements on the fly.
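
To make the programming model concrete, the following is a minimal sketch of how a declarative detect-analyze-response rule might be expressed and evaluated by a continuous rule-based event processor. This is an illustrative assumption, not the system's actual interface: the names (Event, Rule, EventProcessor, launch_forecast_workflow) and the reflectivity threshold are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Any


@dataclass
class Event:
    """A hypothetical observation arriving on a data stream,
    e.g. a value mined from a radar feed."""
    stream: str
    attributes: Dict[str, Any]


@dataclass
class Rule:
    """A declarative detect-analyze-response rule: a predicate over
    incoming events (detect), an analysis step (analyze), and a
    response action such as launching a forecast workflow (respond)."""
    detect: Callable[[Event], bool]
    analyze: Callable[[Event], Dict[str, Any]]
    respond: Callable[[Dict[str, Any]], None]


@dataclass
class EventProcessor:
    """Minimal continuous rule evaluator: applies every registered
    rule to each event as it arrives on a stream."""
    rules: List[Rule] = field(default_factory=list)

    def register(self, rule: Rule) -> None:
        self.rules.append(rule)

    def on_event(self, event: Event) -> None:
        for rule in self.rules:
            if rule.detect(event):
                result = rule.analyze(event)
                rule.respond(result)


# Placeholder for handing parameters to a workflow orchestration engine.
def launch_forecast_workflow(params: Dict[str, Any]) -> None:
    print(f"Launching forecast workflow with {params}")


# Example rule: trigger a forecast when mined radar reflectivity
# exceeds an (assumed) severe-weather threshold.
storm_rule = Rule(
    detect=lambda e: e.stream == "radar" and e.attributes.get("reflectivity", 0) > 50,
    analyze=lambda e: {"region": e.attributes["region"], "lead_time_hours": 2},
    respond=launch_forecast_workflow,
)

processor = EventProcessor()
processor.register(storm_rule)
processor.on_event(Event("radar", {"reflectivity": 57, "region": "OK-NW"}))
```

In this sketch the response step could equally write derived knowledge back onto an event stream, which is the feedback path the adaptive framework relies on.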