Paper 74 (In-Use track)

Semantic Diagnostics for Automated Factories

Author(s): Evgeny Kharlamov, Ognjen Savkovic, Martin Ringsquandl, Guohui Xiao, Gulnar Mehdi, Elem Guzel Kalayci, Werner Nutt, Mikhail Roshchin, Ian Horrocks, Thomas Runkler

Full text: submitted version

Abstract: Automation is one of the biggest trends in modern manufacturing, known as Industry 4.0. Automation brings new challenges to the diagnostics of equipment and the monitoring of processes in factories. We propose to address these challenges with a novel rule-based monitoring and diagnostics language that relies on ontologies and reasoning and allows diagnostic tasks to be written at a high level of abstraction. We show that our approach speeds up the diagnostic routine of engineers at Siemens: they can formulate and deploy diagnostic tasks in factories faster than with existing Siemens data-driven diagnostic languages. Moreover, via analyses of the computational complexity and experiments, we show that our diagnostic language, despite the built-in reasoning, allows for efficient execution of diagnostic tasks over large volumes of industrial data. Finally, we implemented our ideas in a prototype diagnostic system for automated factories.

Keywords: Ontologies; Rule-Based Diagnostics; Automated Manufacturing

Decision: reject

Review 1 (by Carlos R. Rivero)

This paper presents an extension to a rule-based diagnostics language using OBDA and deployed at Siemens.
I think this is incremental work with respect to [16] and am worried about self-plagiarism. I would say more than 50% of the paper was already published at ISWC. I am not sure that ESWC is the best venue for this type of paper.
---- After rebuttal ----
I acknowledge the comments from the authors but do not change my opinion of the paper, which I still believe has been mostly published at ISWC. One important aspect in which this paper differs is its application to smart manufacturing, but the user study is one of the main weaknesses of the paper, as even acknowledged by the authors.

Review 2 (by Takahiro Kawamura)

This paper proposes the Semantic Diagnostic Rule Language, which abstracts conventional rules over raw data to a representation in terms of classes of those data.
Note that the paper is an extension of ISWC2017 In-Use paper.
The basic idea is not new, but the work should be highly regarded for its implementation and deployment in the Siemens factory, in addition to the definition of formal semantics.
Although the page limit imposes constraints, the diagnostic ontology should be described in the paper. It would give some insight into the rule abstraction.
Given the purpose of this study, usability or maintainability should be evaluated rather than system performance alone. These evaluations have not yet been conducted, as stated in the Conclusion.
Also, the Related Work section should cover more enterprise activities, not only Semantic Web studies.
The difference from the ISWC paper seems adequate in length, but technically it is not enough.
- The readers would prefer the paper to follow the ordinary structure, i.e., the Introduction should not have subsections, Related Work should be an independent section, etc.
I acknowledge the authors' comments, but unfortunately, do not change my score.

Review 3 (by anonymous reviewer)

A well-written paper with a sound methodology. The rationale is well presented with a list of features required by the rule language and not universally supported in existing languages. The paper is well suited to the track, as it addresses a real-life problem and the approach is applied to it with results provided. More evidence of the user evaluation carried out would be appreciated. How do the results of this initial evaluation affect the future direction of the work?
Page 3 - PRS_X05 is included twice in the list of sensors; it should perhaps be PRS_X04.
Use 'insight' rather than 'insides' (in a few places).

Review 4 (by Ilaria Tiddi)

The paper presents a rule-based language to monitor diagnostics in smart factories, i.e., manufacturing industries relying on sensor observations to improve their efficiency (automating tasks, monitoring, discovering issues, etc.). The work is a clear fit for the industry track, as it shows how the approach is deployed in a real-world scenario in the Siemens factories. After an introduction to the problem of diagnostic tasks, the paper presents the signal processing language the authors developed, followed by the implementation in the Siemens scenario. In conclusion, the authors show how the rule-based language allows diagnostic tasks to be executed efficiently (in time) over large volumes of data.
The idea of the paper is nice and shows an interesting and effective application of semantic technologies in real-world domains. The paper is also well written and has a reasonably coherent narrative. My rejection is motivated mostly by two things: a number of obscure/confusing parts (especially for less expert readers), and, most importantly, the lack of novelty w.r.t. the authors' previous ISWC paper. More details below.
Concerning the first issue:
- I find the introduction too dispersive: there should be an explicit problem-challenge-approach statement before Section 1.1 that goes straight to the point, then passes to more details as in Sections 1.1-1.3.
- a number of things should be clarified: e.g., in Section 1.2 I could not figure out what the following meant:
"diagnostic systems [...] are highly data-dependent: (i) data about specific characteristics like speed, capacity, and identifiers of individual sensors and pieces of equipment are explicitly encoded in SPRs and (ii) the schema of such data is reflected in the SPRs. " 
- what does authoring SPRs mean, and in what way is it challenging?
- In Section 1.3 you mention that you decided to adopt OBDA and why it was successful in the community, but it is unclear to me what the benefits are in your specific scenario. I am not saying it is a wrong choice, but it might be worth explaining how it helps here.
- at the end of page 6, should not "datasets of KB" rather be "datasets of \mathcal{K}" ?
- as a minor point, I would not use "on top of that", which is quite informal.
- I have a big problem with the conclusion: an evaluation with five engineers is mentioned, saying it gave good insights into the usefulness of your work, but no other detail is given. I would suggest either presenting it as part of the evaluation or stating it as a plan for future work (in particular, do not give information on results you have, or you will be asked why you did not include them).
On the second issue: I understand that Section 1.3 states the contributions and how the paper differs from the previous one, but the evaluation of the work does not seem to focus on these new contributions. From my perspective, the only novelty is a new use case. Perhaps it might be worth considering merging the current work with the following one?
----- After rebuttal -----
I acknowledge the authors' response, but my impression is that it lacks convincing arguments against the main issue of the paper (lack of novelty). The authors seem to have simply repeated their contributions (as written in their paper), and I still find that these new features are not the focus of the paper. So I do not feel like changing my opinion, I am afraid.

Review 5 (by Anna Tordai)

This is a metareview for the paper that summarizes the opinions of the individual reviewers.
This paper describes how a previously published approach is deployed in a real-world scenario in the Siemens factories. The reviewers have some questions regarding details of the approach, which are not addressed in the authors' response. The reviewers also point out issues with the evaluation study: the study reported in the paper covers only system performance, not usability or maintainability. In addition, the reported study does not match the claims made in the conclusion section.
Laura Hollink & Anna Tordai
