Survey: Topics for Document Analysis Summer school
ICFHR 2014: Call for Competitions and Tutorial Proposals
Call for Papers
Call for Datasets
Call for Contributions
Please contribute relevant news to the TC10 groups. Please send any relevant event, notice or link to the newsletter editor: Alicia Fornes.
Message from the editor
Welcome to the October edition of the TC10 newsletter, which includes information about the GREC 2013 post-proceedings in LNCS, the MAURDOR Campaigns, the next call for papers and the ICFHR call for competitions and tutorials.
We would also like to encourage participation in the survey for the next Document Analysis summer school, which was publicized in the TC11 newsletter.
The Tenth International Workshop on Graphics Recognition (GREC 2013), organized by the IAPR TC-10, took place at Lehigh University, Bethlehem, PA, USA on August 20-21, 2013, just before ICDAR 2013, Washington, D.C., USA.
Once again, the GREC workshops proved to be an excellent opportunity for researchers and practitioners at all levels of experience to meet and to share new ideas and knowledge about graphics recognition methods. The level of interaction was intense and rich, as usual.
This year, 29 participants registered for the workshop, 4 of whom unfortunately encountered visa delays and could not attend. The represented countries were Brazil, China, France, Germany, India, Japan, Luxembourg, Malaysia, Spain, Switzerland and the USA.
Full access to presented papers is available from the Program page (http://grec2013.loria.fr/GREC2013/node/13), but is restricted to attendees only. A selection of fully reviewed papers will be published by Springer as an LNCS volume in 2014.
Post-conference proceedings in LNCS
Selected papers will be published in post-conference proceedings in the Springer Lecture Notes in Computer Science series.
- November 30, 2013: extension, revision, and re-submission
- February 28, 2014: 2nd round of revision based on the review comments
- April 2014: camera-ready version sent to LNCS
- June-July 2014: publication by LNCS; copies sent to individual registered participants
The GREC 2013 organizing committee
DAS 2014: Call for Papers: EXTENDED DEADLINE
The 11th IAPR International Workshop on Document Analysis Systems (DAS 2014) will take place in Tours, France, on April 7-10, 2014.
The DAS 2014 paper submission deadline has been extended to October 14th, 2013, 23:59 UTC. However, authors should submit the title and abstract of their paper to the submission site by October 7th, 2013, 23:59 UTC.
- Abstract due date (full paper): October 7, 2013
- Full paper submission: October 14, 2013
- Notification of acceptance (full paper): December 6, 2013
- Camera-ready papers due: January 17, 2014
- Short paper submission: December 20, 2013
- Notification of acceptance (short paper): January 17, 2014
Survey: Topics for Summer School
Following the success of the TC11-endorsed "International Document Image Processing School" (IDIPS 2013) this year (see http://www.facebook.com/idips2013 & https://idips2013.pns.aegean.gr/), and given that a possible second edition of IDIPS is currently under consideration, TC11 would like to receive feedback from the community that will help us better manage future proposals for similar educational activities.
In order to ensure that such future educational activities are accessible to the majority of TC11 members, and that the scientific programme of such activities is as interesting and useful as possible, TC11 would like to seek the views of the research community with regard to the topics to be covered by TC 11 summer schools and possible dates for their organisation.
The survey comprises 4 questions, and should not take more than a couple of minutes to complete. Looking forward to your feedback!
The TC11 leadership team
ICFHR 2014: Call for Competitions and Tutorial Proposals
The 14th International Conference on Frontiers in Handwriting Recognition (ICFHR 2014) Hersonissos, Crete Island, Greece September 1-4, 2014 http://www.icfhr2014.org/
CALL FOR COMPETITIONS The ICFHR 2014 Organizing Committee invites proposals for competitions to be held under the framework of the Conference in September 2014. Competitions should aim at evaluating the performance of algorithms and methods for a particular task of handwriting recognition.
Proposals should contain the following information:
- The names, contact information, and brief CVs of the competition organizers, outlining previous experience in performance evaluation and/or organizing competitions.
- A brief description of the competition, including the particular task under evaluation and why this competition would be of interest to the ICFHR community.
- A draft outline of the competition describing which data are planned to be used, how the submitted methods will be evaluated and which performance measures will be used.
The following rules shall apply to the accepted competitions:
- All competitions must run well in advance of the conference.
- Datasets used in the competitions must be made available after the end of the competitions.
- Evaluation methodologies and metrics used must be described in detail so that results can be replicated later.
- Reports (full papers) on each competition will be reviewed and, if accepted (i.e., the competition ran according to plan and is appropriately described), will be published in the ICFHR 2014 conference proceedings.
- The results of accepted competitions will be announced during a dedicated session of the conference.
- Participants should be encouraged to present their algorithms (if not already done) in a conference paper at ICFHR 2014.
IMPORTANT DATES
- 01 November 2013: Submission of competition proposals.
- 25 January 2014: Notification of acceptance.
- 01 February 2014: Competitions publicised and open to participants.
- 21 April 2014: Deadline for submission for review of full papers describing the competitions. Papers must be sent directly to the Competition Chairs.
- 10 May 2014: Camera-ready papers reporting on the competitions, for inclusion in the proceedings.
If you have any queries, please contact the ICFHR-2014 Competition Chairs.
CALL FOR TUTORIALS
The ICFHR 2014 Organizing Committee invites proposals for tutorials to be held prior to the main conference in Crete, Greece. Tutorials should serve one or more of the following objectives:
- Introduce students and newcomers to major topics of ICFHR research.
- Provide instruction on established practices and methodologies.
- Survey a mature area of ICFHR research and/or practice.
- Motivate and explain an ICFHR topic of emerging importance.
- Introduce expert non-specialists to an ICFHR research area.
Proposals should contain the following information:
- A brief description of the tutorial.
- A detailed outline of the tutorial, including the preferred length: either 3 hours (1/2 day) or 6 hours (1 day).
- A characterization of the potential target audience for the tutorial, including prerequisite knowledge.
- A description of why the tutorial topic would be of interest to a substantial part of the ICFHR audience.
- A brief resume of the presenter(s), which should include name, postal address, e-mail address, background in the tutorial area, any available example of work in the area (ideally, a published tutorial-level article on the subject), evidence of teaching experience (including references that address the proposer's presentation skills), etc.
- The name and e-mail address of the corresponding presenter.
The evaluation of the proposal will take into account its general interest for ICFHR attendees (e.g., a tutorial on object-oriented inheritance will not be appropriate), the quality of the proposal (e.g., a tutorial that simply lists a set of concepts without any apparent rationale behind them will not be approved) as well as the expertise and skills of the presenters. We emphasize that the primary criteria for evaluation will be whether a proposal is interesting, well-structured, and motivated, rather than the perceived experience/standing of the proposer.
Last but not least, the tutorial should attract a meaningful audience. Those submitting a proposal should keep in mind that tutorials are intended to provide an overview of the field; they should present reasonably well-established information in a balanced way. Tutorials should not be used to advocate a single avenue of research, nor should they promote a product.
IMPORTANT DATES
- 01 November 2013: Submission of proposals for tutorials.
- 25 January 2014: Notification of acceptance.
If you have any queries, please contact the ICFHR-2014 Tutorial Chairs.
The MAURDOR Campaigns
Scanned document processing is an important issue for information retrieval. The MAURDOR campaigns aim at assessing the progress of automatic systems in this area. The goal is to evaluate the ability of systems to extract relevant information from scanned documents. After the success of the first campaign, which took place in spring 2013, the Laboratoire national de métrologie et d'essais (LNE) and CASSIDIAN, an EADS company, will conduct a new MAURDOR evaluation campaign in order to support research in scanned document processing and help advance the state of the art in Optical Character Recognition technologies.
LNE and CASSIDIAN provide the following to participants:
- Consistent data for the training, development and test sets.
- Automatic scoring tools.
- Common rules needed to assess the different steps essential for scanned document processing.
A workshop will be organized at the end of the campaign to present the results and compare the approaches of the various participants. The evaluation plan is available at www.maurdor-campaign.org.
A heterogeneous database
The MAURDOR evaluations are based on a very heterogeneous database. The training set is multilingual (English, French, and Arabic) and consists of 7,000 different documents corresponding to the following classes:
- Blank forms and completed forms (around 12% of the database)
- Typewritten commercial documents, sometimes with several manual annotations (around 40% of the database)
- Handwritten personal letters, sometimes with typewritten headers (around 25% of the database)
- Commercial letters such as purchase orders or bills (around 20% of the database)
- Other documents, such as newspaper articles or maps (around 3% of the database)
The test set contains 1,000 documents. The proportion of documents belonging to different categories is the same as for the training data.
MAURDOR is based on a complete processing chain in which five separate modules are implemented. Each module performs a particular function contributing to the complete processing of a scanned document. The following five modules are independently assessed during the campaign:
- Task 1: Segmentation and typing of areas (table, text, image...)
- Task 2: Writing type characterization (handwritten or typewritten characters)
- Task 3: Language identification
- Task 4: Optical Character Recognition
- Task 5: Establishing reading order and relations between areas
Participants can submit systems for the individual tasks of their choice. The complete end-to-end processing chain will also be evaluated as an operational application, using a keyword spotting scenario.
How to participate?
This evaluation is intended to be of interest to all researchers working on the problem of scanned document processing. The only requirement is participation in the closing workshop: all participants must attend the evaluation workshop and be prepared to discuss their system(s) and results in detail. To participate, it is sufficient to fill out the registration form available at www.maurdor-campaign.org.
IMPORTANT DATES
- Evaluation plan released: 01/07/2013
- Training data available: 01/07/2013
- Beginning of the campaign: 04/11/2013
- End of the campaign: 02/01/2014
- Beginning of the adjudication: 02/01/2014
- End of the adjudication: 10/01/2014
- Workshop: February 2014
Call for Datasets
We would like to remind you that the TC10 and TC11 Web sites always welcome contributions of new datasets or other resources related to the community. We would like to encourage all the TC10 and TC11 members to submit such material to the TC10 and TC11 for archiving. The availability of datasets, ground truth and performance evaluation tools online is not only good practice, but also a requirement for a field to progress.
We would like to make a special request to the organizers of recent and future competitions. Even if your competition datasets and evaluation tools are available through other Web sites, please consider archiving them with TC10 / TC11 as well. Web sites often go off-line, and useful resources are frequently lost forever.