RSE-HT2011

From Wiki of the E-Business and Web Science Research Group

Research Seminar - HT 2011 

Universität der Bundeswehr München, Germany, HT 2011

Contact: Alex Stolz, alex.stolz (AT) unibw (DOT) de
Lecturer: Martin Hepp


Schedule (Group Research Seminar)

Oct 11, 2011 (3:00 - 4:30 pm, Room 36/1134)

Oct 18, 2011 (3:00 - 4:30 pm, Room 36/1134)

  • Presentations of their research topics by the PhDs and Post-Docs

Oct 25, 2011 (3:00 - 4:30 pm, Room 36/1134)

  • Table of contents due for students writing a BA or MA thesis (Kevin Siegerth and Markus Kramer)

Nov 8, 2011 (3:00 - 4:30 pm, Room 36/1134)

  • GoodRelations and Microdata / schema.org (Martin Hepp)

Monday, Nov 14, 2011 (3:00 - 4:30 pm, Room 36/1134)

  • OntoClean Reading Group (Bene Rodriguez and everybody)
  • Literature: to be added by BR

Nov 22, 2011 (3:00 - 4:30 pm, Room 36/1134)

  • Microdata
  • ToC and Related Work Review Seminar Thesis "Comparison of RDFa and Microdata Syntax for Exchanging Structured Data on the WWW" (Venera Pjetraj)

Nov 29, 2011 (3:00 - 4:30 pm, Room 36/1134)

  • Results presentation for students writing a BA or MA thesis (Kevin Siegerth and Markus Kramer)

Dec 6, 2011 (3:00 - 4:30 pm, Room 36/1134)

  • PhD presentation US
  • PhD presentation AR

Dec 13, 2011 (3:00 - 4:30 pm, Room 36/1134)

  • Results presentation: "Comparison of RDFa and Microdata Syntax for Exchanging Structured Data on the WWW" (Venera Pjetraj)
  • PhD presentation AS

Dec 20, 2011 (3:00 - 4:30 pm, Room 36/1134)

  • Final presentations for students writing a BA or MA thesis (Kevin Siegerth and Markus Kramer)

Media

Steve Jobs: How to live before you die

Types of Research Contributions

The following is taken from the EKAW 2010 Call for Papers and is, in my opinion, an excellent summary of the various types of research papers that are usually accepted at conferences and workshops:

A) Standard research papers

These are "standard" papers presenting a novel method, technique or analysis with appropriate empirical or other types of evaluation as proof of concept. The main evaluation criteria here will be originality, technical soundness and validation.

B) In-use papers

Here we are expecting papers describing applications of knowledge management and engineering in real environments. Applications need to address a sufficiently interesting and challenging problem on real and large datasets, involving many users, etc. The focus is less on the originality of the approach and more on presenting real, large-scale and complex systems that solve a significant problem. Technical details needed to understand how the problem is solved are required. Evaluations should involve real users of a system rather than represent a purely academic exercise. The papers will be evaluated according to the significance and practical relevance of the described research, as well as with respect to the technical soundness of the described solution and the accompanying evaluation.

C) Problem Analysis papers

We invite researchers to also publish problem analysis papers which do not present any novel method, technique or approach to solving a problem, but help to understand the problem itself. Understanding the characteristics of a problem is an important task in research and can benefit many people working on the same or at least similar problems. We expect in-depth discussions and analysis of a certain phenomenon or problem, with clear definitions as well as qualitative and quantitative analyses of the main characteristics of the problem. We also expect a reasonable review of the state of the art stating to what extent current solutions fall short. Papers will mainly be evaluated with respect to how general and technically sound their problem analysis is and how useful it will be for other researchers working on the same problem. We expect that such papers will guide future research by highlighting critical assumptions, motivating the difficulty of a subproblem or explaining why current techniques are not sufficient, all corroborated by quantitative and qualitative arguments. Evaluation criteria will also include appropriate categorization of the problem area and description of present solutions and approaches, as well as appropriate description of the limitations of those solutions and approaches.

D) Validation papers

A fundamental characteristic of research is that it should be reproducible. In some disciplines, reproduction of results by others is a basic research activity. We would like to encourage researchers to reproduce and validate methods, results, experiments, etc. proposed by others in a new context or application, on new datasets, under new assumptions, etc. The goal is clearly to reach interesting and significant new conclusions about the method/approach in question that warrant a stand-alone publication. The reproduction of results should thus lead to new knowledge about the method in question, or reveal inherent problems in the assumptions of the original research or limitations of previous solutions. Papers will be evaluated with respect to the soundness of the rationale for reproducing a certain approach, as well as with respect to the new knowledge that is generated by reproducing the approach in question. A clear comparison between the results obtained through the reproduction and the original results is mandatory.


Resources