From Oct 27-30, 2008, the 5th Community Informatics & Development Informatics conference was held in Prato, Italy. As always, the conference was a great meeting of minds of researchers and practitioners from all over the world, and from many different backgrounds, ranging from cultural anthropologists and process facilitators to hardcore computer scientists.
At the end of the conference, I was asked to give my impression of the direction the overall field is heading in, based on the presentations and discussions. It was very hard for me to summarize the widely varying ideas and projects presented, an overview of which is given here. Instead, my aim was to identify some underlying methodological strands that, when woven together, could help to strengthen the fields of community and development informatics in terms of coherence, generalizability and reusability of research ideas, and the practical impact of their implementation. This text builds on that presentation.
Framing community informatics research
Communities and technologies are key to community informatics. These technologies cover a wide range, with sophisticated computer and networking technologies being only one end of the spectrum. Face-to-face and other non-computerized technologies can be just as important, if not more so. ICTs should therefore always be read as covering this whole spectrum.
Communities and technologies co-evolve in that technologies both afford and constrain behavior of their communities of use, and, in turn, these communities shape the technologies as they are being applied in practice.
Community informatics research and practice cannot be seen separately. Without practice, no research can be tested and validated. On the other hand, no practical project can be done without an in-depth analysis, i.e. research, of many aspects. The term "community informatics research" should therefore always be read as implying both research and practice.
The difference between community informatics and development informatics in this respect is only one of degree. Whereas development informatics in general could be said to be dealing more with the practicalities of technology implementation and adoption, community informatics seems to focus more on the social interaction aspects of the technologies-in-use. Still, as with any of these complex but related socio-technical fields, it seems not very fruitful to overly demarcate their boundaries.
Aspects of community informatics
When researching the interplay of communities and their technologies, at least four aspects are of particular interest:
- Context/values. Community informatics research, unlike more traditional branches of informatics, is very strong in analyzing the context of use of ICTs, including the stakeholders involved, their interests and goals, and many essential cultural determinants. In particular, much attention is paid to assessing the community values driving the development and uptake of these technologies. Such values include soft but key notions like passions, energy, empowerment, legitimacy, and social inclusion. Information systems developers who do not take these context factors and values into account do so at their own peril, and especially at the peril of their communities of users.
- Cases. Community informatics research is very much case-driven. This results in rich, "lived" stories about authentic information and communication requirements, rather than the more abstract "user" requirements often elicited in classical IS development projects.
- Process/methodology. This is both a strength and a weakness of current community informatics research. On the one hand, it has identified a large palette of situated communication, collaboration, development, evaluation and other community processes and methodologies. Furthermore, it generally displays a great sensitivity towards tailoring these processes to the authentic needs of the communities in which they are applied. On the other hand, perhaps because these processes are so often case-based and situated, the lessons learnt are hard to generalize and reuse across cases, making it difficult to go beyond "situated methodologies". This is unfortunate, as many community informatics wheels are continuously being reinvented, and the exchange with related fields, such as mainstream IS development, is minimal.
- Systems. Another community informatics research strength/weakness. Much traditional ICT R&D focuses on developing and evaluating individual technologies, such as office applications, e-mail, blogs, etc. However, realistic use cases increasingly involve "tool systems" of multiple technologies working in concert, and in a specific social context of use, such as a particular community. Community informatics research in general is strong in taking such a comprehensive socio-technical systems view. However, as with its processes and methodology, this systems view is often not framed in more general terms, making it hard to disseminate findings more widely. The paradox is that, in contrast with general IS development approaches, community informatics research seems to have much more of a systems way of thinking in practice, while being weaker in the more theoretical, systematic approach to systems analysis, design, implementation and evaluation. Both community informatics and IS research in general should therefore continue to learn from and influence one another.
Moving community informatics research forward
Community informatics is a "meta-field", building bridges between existing social and technical paradigms (in both theory and practice). This makes it hard to frame the "right" way to move its research forward, as some of us lean more towards the social side, others towards the informatics side. Still, how then to address at least some of the process, methodology and systems weaknesses identified above? Some practical steps forward could be:
- Definitions, definitions, more definitions. Some of us seem to be afraid of giving definitions, as they would not capture every possible meaning and would overly constrain discourse. True, definitions, almost by definition, constrain. However, they also afford. They point out new directions and open up unexpected viewpoints. Furthermore, one should be clear about the exact role definitions should play. They are only hypotheses, forcing authors and readers alike to focus, explain, and align their ideas, not to force one particular point of view to be the one and only truth. Providing clear definitions is all the more essential when one reaches beyond the confines of one's own paradigm. Definitions should always be tentative, and multiple definitions should be allowed or even welcomed, as long as they help us become more explicit about stating and comparing our assumptions.
- Identify lessons learnt/best practices. Another step would be to more systematically identify and compare lessons learnt, and perhaps even come up with some best practices, or at least criteria for assessing them. Granted, best practices are in the eye of the beholder and, given the situatedness of much community informatics work, may be hard to define. Still, as with the definitions, the contexts in which they are to apply should be described as carefully as possible, and their validity in any particular case always critically questioned. In that way, lessons learnt and best practices can help crystallize and convey the added value of our research to ourselves and the outside world.
- Testbeds/collaboratories. In IS research in particular, and science in general, much work is currently being conducted on the development of testbed and collaboratory methodologies for large-scale, realistic development of technologies in complex, evolving contexts of use. A case in point is the adoption of the Living Labs methodology, already used in some development informatics projects in South Africa, as presented at the conference. More strongly connecting community informatics research to these systematic approaches to socio-technical systems development and adoption could be very beneficial to both fields. Along the same lines, simulation approaches such as system dynamics could be used to more efficiently come up with realistic testing scenarios in a CI context; a simple sketch of such a model follows this list.
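To make the simulation suggestion more concrete, here is a minimal sketch of what a system dynamics model of community ICT adoption might look like: a two-stock, Bass-style diffusion model whose runs could serve as candidate testing scenarios for a testbed or Living Lab setting. The model structure, parameter names and values are illustrative assumptions only; they are not taken from any of the projects presented at the conference.

```python
# Minimal system-dynamics sketch: a stock-and-flow model of ICT adoption in a
# community, in the spirit of a Bass diffusion model. All parameters
# (community size, innovation and imitation rates) are purely illustrative.

def simulate_adoption(community_size=500, innovation_rate=0.01,
                      imitation_rate=0.25, months=36, dt=1.0):
    """Euler integration of a two-stock model: potential vs. actual adopters."""
    potential = float(community_size)  # stock: members not yet using the tool
    adopters = 0.0                     # stock: members actively using the tool
    history = []
    for month in range(months):
        # flow: adoption driven by external promotion plus word of mouth
        adoption_flow = (innovation_rate * potential +
                         imitation_rate * potential * adopters / community_size)
        potential -= adoption_flow * dt
        adopters += adoption_flow * dt
        history.append((month + 1, round(adopters)))
    return history

if __name__ == "__main__":
    for month, users in simulate_adoption():
        print(f"month {month:2d}: ~{users} active users")
```

Varying the innovation and imitation parameters, or adding stocks for drop-out and re-engagement, would yield a family of adoption scenarios against which a tool system could be evaluated before and during deployment.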