Past Events in the URC

Seminar on how to write research reports and papers with high quality
Seminar on quantitative data analysis: SPSS and AMOS
Seminar on qualitative data analysis with NVivo
Seminar on tools for citation analysis and journal evaluation
Seminar on research and development funding applications and tips for success
Seminar on how to identify and avoid predatory publishers




Seminar on how to write research reports and papers with high quality

To enhance the research capability of self-financing degree institutions, a seminar on writing high-quality research reports and papers was held on 7 July 2015 at the Caritas Institute of Higher Education. Professor Qing Li, Director of the Multimedia Software Engineering Research Centre at the City University of Hong Kong, was invited to give the talk.

Professor Li has outstanding research expertise in computer science, having published over 90 refereed journal papers and 210 conference papers in this field. In addition, he is or has been associate editor of several notable journals, such as IEEE Multimedia, IEEE TMM, IEEE Internet Computing, World Wide Web (WWW), and the Journal of Web Engineering.

Professor Li discussed a number of aspects of writing good research papers. First, the research topic should be new. He argued that quality research outputs can only be generated by developing fresh ideas, and that papers that make only an incremental advance on a particular topic have lower priority for publication in top-tier journals and conferences.

Second, writing a good abstract is distinct from preparing a table of contents, an introduction or a summary. Professor Li shared an exercise for improving one’s ability to write abstracts: writing a new abstract for a top-tier journal paper and comparing it with the original.

Professor Li noted that journal editors typically look at the title, keywords and the reference list of a paper first, before assigning it to reviewers. The reference list provides the critical first impression of a paper, reflecting the attitude of the author(s). In his view, a poorly prepared reference list makes the editors, as well as the reviewers, doubt whether the author(s) has really read the cited papers.

Professor Li also provided some practical advice:

1. Know the audience and focus on the aspects that concern them most. Authors who are clear about the purpose of the paper are often able to deliver the message more effectively.
2. Limit the scope of the research: address only one or two problems in a single paper, and discuss them explicitly and comprehensively.
3. Write in a hierarchical structure to increase the attractiveness of the article, while maintaining the clarity of the key points and the logical flow. Also, provide strong evidence and proof to convince readers of the results presented.
4. Demonstrate a good motivation in the introduction and keep it concise. Professor Li suggested that the introduction section, coupled with the literature review section, should not exceed a third of the whole paper. An example of a good introduction was provided in the PowerPoint slides (see the link below).
5. Distinguish between contributions and research steps. While research steps only report what you have done, contributions highlight what you have achieved, which implies a certain degree of novelty. To write good research papers, one has to make contributions with a strong basis and convincing results, and present appropriate claims.
6. Avoid long sentences. Short sentences are less ambiguous and more understandable. Professor Li further noted that long sentences are signs of potential errors.
7. Avoid using too many relative pronouns (e.g. ‘that’, ‘which’, ‘where’) in a paper, as readers often find it difficult to work out what the relative clauses refer to.
8. Provide adequate figures. Professor Li mentioned that reviewers are often pleased to see figures and diagrams on a cursory reading.
9. Cite relevant works by programme committee or editorial board members, which can help create a positive impression with the editors.

On publishing outputs, Professor Li noted that researchers may extend published conference papers into journal papers as long as a sufficient amount of new material (e.g. 30%) is included. However, turning a published journal paper into a conference paper would be regarded as a duplicate submission.

When using findings from the same project for another paper, one must also provide a substantial amount of new material. Professor Li noted that a minimum of 70% new material is common practice; otherwise, the paper would be regarded as a duplicate submission.

For further details, please refer to the PowerPoint file of the seminar.

Seminar on quantitative data analysis: SPSS and AMOS

As one of the seminars in the series on Research Capability Enhancement for the Self-financing Degree Sector of Hong Kong, a seminar on advanced statistical and modelling techniques using SPSS Statistics and AMOS was held on 24 July 2015.

For this four-hour seminar, Ms Brenda Lee, a licensed SPSS trainer, was invited to be the speaker. During the seminar, techniques including multivariate analysis of variance (MANOVA), repeated measures ANOVA and linear mixed models were reviewed. Exploratory factor analysis (EFA), confirmatory factor analysis (CFA) and structural equation modelling (SEM) were also discussed. The seminar further explored and demonstrated the application of these statistical techniques, including the situations in which they are used, the assumptions made by each method, how to set up the analysis, and how to interpret the results.
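
To see what two of these techniques involve in practice, the sketch below gives a rough Python analogue (using the pandas and statsmodels libraries) of a MANOVA and a linear mixed model. It is for illustration only and is not material from the seminar, which used SPSS Statistics and AMOS; the data file and column names (survey.csv, score1, score2, group, subject) are hypothetical.

    # Illustrative Python analogue of two techniques covered in the seminar.
    # The file and column names below are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.multivariate.manova import MANOVA

    df = pd.read_csv("survey.csv")

    # MANOVA: do the groups differ on the two outcome measures considered jointly?
    manova = MANOVA.from_formula("score1 + score2 ~ group", data=df)
    print(manova.mv_test())

    # Linear mixed model: repeated observations nested within subjects.
    mixed = smf.mixedlm("score1 ~ group", data=df, groups=df["subject"]).fit()
    print(mixed.summary())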

For further details, please refer to the PowerPoint file of the seminar.

You may download the PowerPoint files of the previous workshop and seminars on quantitative data analysis using SPSS by clicking the links below:

•  Workshop on quantitative data analysis: Introduction to SPSS
•  Seminar on quantitative data analysis: Differences between groups
•  Seminar on quantitative data analysis: Correlation and regression
•  Seminar on quantitative data analysis: Scale's reliability and validity


Seminar on qualitative data analysis with NVivo

A seminar on qualitative data analysis using the software tool ‘NVivo’ was held on 21 August 2015, as one of the series on Research Capability Enhancement for the Self-financing Degree Sector of Hong Kong. It was given by Ms Sue Bullen, the Training and Research Consultancy Regional Manager (Asia Pacific) of QSR International, the company that develops NVivo.

During the seminar, Ms Bullen introduced a wide range of NVivo functions, including creating a project, importing Word, PDF, image and video files into NVivo, and coding them with different nodes. Various coding methods were discussed and demonstrated. Ms Bullen also illustrated techniques for exploring data with the word query function, and ways to visualize and present the findings.

NVivo has been made available to OUHK academics. Interested staff may contact the General Office of the relevant Unit for further details.

Prior to this seminar, a related seminar providing a preliminary introduction to NVivo had been held on 27 May 2015. The PowerPoint file can be downloaded here.


Seminar on tools for citation analysis and journal evaluation

A seminar introducing the tools and services for citation analysis and journal evaluation provided by the OUHK Library was held on 31 August 2015. Ms Jane Siu-kwan Tsang, Senior Assistant Librarian, was invited as the speaker.

The Library provides a wide range of tools for citation analysis. Among them, Ms Tsang focused on the Citation Indexes and the Journal Citation Reports, both of which are widely recognized in the academic world.


The Citation Indexes refer to the core collections on the Web of Science, three of which are subscribed to by the Library – the Science Citation Index Expanded (SCI-Expanded), the Social Sciences Citation Index (SSCI), and the Arts and Humanities Citation Index (AHCI). These indexes cover thousands of the world’s leading journals in their respective fields. Thus, finding references through the Citation Indexes can be effective, as the important papers on a particular topic are usually included.

Furthermore, the Citation Indexes allow users to do a cited reference search, which is useful for forward navigation (checking who cited an article), as well as backward navigation (checking the references cited in an article). In addition, the citation report for a researcher, which provides statistics such as the average citations of each paper and the H-index, can be easily generated. This greatly eases the difficulty of measuring and tracking the significance of one’s research outputs.
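
As an illustration of how one of these statistics is derived, the short sketch below (in Python, with made-up citation counts) computes the H-index: the largest number h such that at least h of a researcher’s papers have received at least h citations each.

    # Illustrative only: compute an H-index from a hypothetical list of citation counts.
    def h_index(citations):
        # Rank papers from most to least cited, then find the last rank at which
        # the citation count is still at least as large as the rank itself.
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, count in enumerate(ranked, start=1):
            if count >= rank:
                h = rank
            else:
                break
        return h

    print(h_index([25, 8, 5, 4, 3, 1, 0]))  # prints 4: four papers have at least 4 citations each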

While the Citation Indexes provide a convenient way of assessing an article or author, the Journal Citation Reports help in evaluating the quality of a journal. In particular, they provide the two-year and five-year impact factors of each journal in the Citation Indexes, which are useful for academics in identifying appropriate journals in which to publish their papers.
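
For readers unfamiliar with how the two-year impact factor is defined, the sketch below works through the standard formula with made-up figures: citations received in a given year to items the journal published in the previous two years, divided by the number of citable items it published in those two years.

    # Hypothetical figures illustrating the standard two-year impact factor formula.
    citations_in_2015_to_2013_items = 180
    citations_in_2015_to_2014_items = 150
    citable_items_2013 = 60
    citable_items_2014 = 70

    impact_factor_2015 = ((citations_in_2015_to_2013_items + citations_in_2015_to_2014_items)
                          / (citable_items_2013 + citable_items_2014))
    print(round(impact_factor_2015, 2))  # 2.54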

Despite their convenience and advantages, Ms Tsang noted that the results obtained from any index or tool may be biased, as each covers only a small portion of the world’s publications and focuses on certain disciplines. She recommended that academics cross-check the results from the Citation Indexes and the Journal Citation Reports with other citation analysis tools, such as Scopus, Google Scholar and Publish or Perish.

Ms Tsang also noted the limitations of current methods for citation analysis. As they are mostly quantitative, the Citation Indexes make no distinction between positive and negative citations. Furthermore, the impact of an article could be underestimated, as most indexes only count citations and neglect readership. Authors’ names may also appear inconsistently across papers, making automatically generated reports inaccurate.

Ms Tsang demonstrated in the seminar how to use Citation Indexes and Journal Citation Reports. For further details, please refer to the PowerPoint file and the video recording of the seminar.


Seminar on research and development funding applications and tips for success

The URC organized a seminar to introduce funding from internal and external sources and to provide practical tips for successful applications. The seminar, held on 2 September 2015, was given by Dr Kam-cheong Li, Director of the URC.

In the seminar, Dr Li first introduced an internal funding source, the Research and Development Fund approved by the President’s Advisory Committee on Research and Development (PACRD), and an external one, the Competitive Research Funding Schemes for the Local Self-financing Degree Sector from the Research Grants Council (RGC). As these two schemes are already well known, the seminar focused on a newly launched internal funding source: the Katie Shu Sui Pui Charitable Trust – Research and Publication Fund (Applied Research).

As each fund has its own requirements and restrictions, academics who wish to submit research proposals are advised to choose carefully among the different funding schemes. A concise comparison was made in the seminar of the PACRD fund, the Katie Shu Sui Pui Charitable Trust fund and the RGC’s Faculty Development Scheme (FDS).

On proposal preparation, Dr Li reminded academics to provide detailed quotations for budget items and strong justifications for each item, particularly in RGC funding proposals for employing staff at senior research assistant level. Principal investigators also need to have expertise and a good track record in the proposed topic. Another important aspect is to follow the submission guidelines, including the details on formatting.

For a detailed comparison of the funding schemes, please refer to the PowerPoint file of the seminar.


Seminar on how to identify and avoid predatory publishers

The academic world has experienced a boom in open-access publishing in recent years. Open-access publishing makes scholarly content more accessible to the general public. This publishing model, though originating with respectable intentions, has prompted an explosion of poorly managed, unrecognized and solely profit-driven ‘predatory’ publishers.

To raise awareness and provide practical tips on this issue, a seminar on how to identify and avoid predatory publishers was held on 2 September 2015. The seminar was given by Dr Kam-cheong Li and Dr Billy Tak-ming Wong of the URC.

In the seminar, the distinctions between credible and predatory publishers were outlined. Credible publishers usually disclose the details of their peer-review process, whereas predatory publishers often try to conceal that information behind vague statements. Credible publishers also provide detailed pricing schemes and publishing policies, normally leave the copyright of open-access publications with the author rather than the publisher, and disclose information about the digital preservation of the journal and its managerial members.

Some online references were recommended for identifying predatory publications, including the Directory of Open Access Journals (DOAJ) and Beall’s List. DOAJ serves as a ‘whitelist’: the open-access journals it includes are peer-reviewed and have editorial quality control. Beall’s List, on the other hand, provides ‘blacklists’ of predatory publishers and questionable open-access standalone journals. In addition, Dr Wong noted that journals indexed in databases such as Web of Science and Scopus are usually credible.

Dr Wong urged academics to be alert when receiving email invitations to submit papers to a journal. Predatory invitations are usually sent as mass emails with no specified recipients. Some may mimic a formal invitation and even quote a fake impact factor. Besides using the whitelist and blacklists mentioned above to check the publishers and journals, Dr Wong suggested that academics also check the name and email address of the sender.

Some other practical guidelines were introduced, as follows (a simple screening sketch based on these red flags appears after the list):

1. Be sceptical if the period for acceptance notification quoted by a publisher appears very short.
2. Use Ulrichsweb, which provides information on whether a journal is refereed and indexed by reputable databases.
3. Use the Directory of Open Access Scholarly Resources (ROAD) to check the ISSN of journals. Those without a registered ISSN are potentially predatory.
4. Check the website of the journal. Predatory journals’ websites often contain typos, grammatical mistakes, contradictory details, dead links and boastful language.
5. Check the name of the journal. Journal names that imply an extremely broad scope or are incongruent with the stated mission should arouse suspicion. Some predatory journals specify a region in their names yet have addresses located elsewhere (e.g. a journal named ‘American …’ with a mailing address in India). Dr Wong also noted that some predatory journals use the exact name of existing peer-reviewed journals.
6. Check the editorial board. Most predatory publishers do not reveal much information about their editorial members, and there are cases where several journals share duplicated editorial boards. Note, however, that some publishers list academics on editorial boards without their prior consent, so a familiar name is not a guarantee of quality.
7. A straightforward way is to google the journal to check whether its status is disputed.
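
The sketch below (in Python) combines a few of the red flags above into a rough screening aid for a hypothetical journal record. The field names and the two-week threshold are assumptions for illustration, not rules given in the seminar, and a manual check against DOAJ, Beall’s List and the journal website is still needed.

    # Illustrative screening aid based on the red flags above.
    # The record fields and the 14-day threshold are assumptions, not rules from the seminar.
    def predatory_red_flags(journal):
        flags = []
        if not journal.get("issn"):
            flags.append("no registered ISSN")
        if journal.get("days_to_acceptance", 999) < 14:
            flags.append("unusually short acceptance period")
        if not journal.get("editorial_board_listed", False):
            flags.append("editorial board not disclosed")
        if journal.get("scope_broader_than_name_suggests", False):
            flags.append("name incongruent with stated scope")
        return flags

    example = {
        "name": "American Journal of Advanced Everything",  # made-up record
        "issn": "",
        "days_to_acceptance": 7,
        "editorial_board_listed": False,
        "scope_broader_than_name_suggests": True,
    }
    print(predatory_red_flags(example))  # all four flags are raised for this made-up record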

Dr Wong acknowledged that no single approach is completely reliable. Academics are advised to check any journal they are not familiar with using several of these approaches before submitting their papers.

For further details, please refer to the video recording of the seminar.


    

 

© Copyright 2015. The Open University of Hong Kong. All Rights Reserved.