Effective collaboration is built on the skills, values, and capabilities of team members, especially those in leadership positions. Nonetheless, technology, used properly, can play an important role. (In the case of virtual teams, it plays an essential role.)
The tools supporting collaboration continue to become more powerful and sophisticated. Sensors can deliver real-time data. Visualization software can provide charts and images in fine detail. Communications can be rapid and rich, providing much of the nuance delivered by face-to-face interactions. Artificial intelligence software can advise, intercede, contextualize, and alert. We have tools to translate, prioritize, and sequence actions. Gaming can be used both for education and to provide experiences that engender trust.
But the technologies available for collaboration must be used with care and discretion. The design of individual tools, as well as the environment where they come together, must take into account the concerns, values, and sensitivities of all the people who use them.
It is all too easy to assume that more is better. More detail, more function, faster speeds, greater transparency, and more intervention from messaging, advice, searches, and other aids, including artificial intelligence, may seem, on the face of it, to make things work better. I’ve often been reminded, as I’ve worked with technology or consulted with usability experts, of something a research partner once told me that surprised me. He had been a pilot in the Vietnam War. He told me that the first thing he did once he got into an aircraft was to turn off all the warning systems. The buzzers, bells, and blinking lights intended to alert him to threats created too much confusion in actual combat. He claimed the pilots understood, better than the engineers did, that distraction was a deadly risk. (I was reminded of this years later when I saw the first Star Wars movie: “Use the Force, Luke!”)
Next time, I’ll talk about how facilitated collaboration tools might come together in an ideal way. But in this post, I’ll review some of the dangers.
Too much information. Not only can this be a distraction, but it can skew judgment. In his book Blink, Malcolm Gladwell talks about how better diagnoses of heart attacks were achieved by reducing the amount of information that went to physicians.
Privacy. One problem I’ve seen with the growing ability to monitor the work of employees is the loss of initiative, trust, and opportunities to learn from mistakes. With moment-by-moment “transparency,” people do not have room to improvise, take chances, or think for themselves. Privacy also matters because corrections are often best made away from the group as a whole. Most leaders know that taking an employee aside and discussing a mistake without humiliating them in front of their peers is a more effective way to correct behavior and maintain trust. Respect is difficult to program into an all-seeing system, but people perform better when they are treated as human beings, with compassion. In addition, there could be legal consequences if privacy is violated.
Opportunity. Just as the best sports team isn’t necessarily the one that looks good on paper, the best person for the job or the best person to provide input isn’t necessarily the one who has perfect credentials. The chemistry between individuals, their experience with each other, their diverse perspectives, and their insights may not be apparent through the algorithms used to match them to assignments.
Sequencing. Often, the order in which information is gathered, decisions are made, and action is taken is enforced by flowcharts, which can be codified in programs. In many circumstances, this can ensure quality. However, as anyone who has experienced overzealous approaches to quality knows, it can also slow things down by halting the process until an answer is given, even when other work could proceed in parallel. It can constrict people who have imaginative new ways of working, and it can enforce approvals that are not always necessary. Moving in lockstep has never been a good approach to fresh thinking and innovation, and it can frustrate workers who have initiative. In addition, new circumstances may demand quick adaptation, and overly deterministic systems can make this impossible.
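To make the sequencing problem concrete, here is a minimal sketch in Python. The workflow, the task names, and the timings are all hypothetical; the point is only the contrast between a lockstep process that halts everything at an approval gate and one that lets independent work continue while the approval is pending.

```python
import concurrent.futures
import time

# Hypothetical tasks in a publication workflow. The names, timings,
# and structure are illustrative, not drawn from a real system.
# All three tasks are mutually independent.
def legal_approval():
    time.sleep(2)  # simulate a slow human sign-off
    return "legal: approved"

def copy_edit():
    time.sleep(1)  # does not depend on the legal review
    return "copy edit: done"

def build_artwork():
    time.sleep(1)  # also independent of the legal review
    return "artwork: done"

def run_lockstep():
    """Flowchart-style process: every step waits its turn behind
    the approval gate, even work that does not depend on it."""
    start = time.time()
    results = [legal_approval(), copy_edit(), build_artwork()]
    return results, round(time.time() - start, 1)  # ~4.0 seconds

def run_parallel():
    """Same tasks, but independent work proceeds while the
    approval is still pending."""
    start = time.time()
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(f)
                   for f in (legal_approval, copy_edit, build_artwork)]
        results = [f.result() for f in futures]
    return results, round(time.time() - start, 1)  # ~2.0 seconds

if __name__ == "__main__":
    print(run_lockstep())
    print(run_parallel())
```

In the lockstep version, everything queues behind the slowest gate; in the second version, only work that genuinely depends on the approval would have to wait for it.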
Usability. The longer it takes someone to master a tool, and the more it stumps them or leads them astray, the less useful the tool is. And, of course, one person’s complicated is another person’s simple. Ideally, there should be a variety of ways to interact with a tool, levels of expertise should be apparent, and the tool should adapt to the person: their physical skills, their thinking, the way they learn, and the way they best participate on a team.
Relevance. Think of the QWERTY typewriter. The key layout, which we’re still locked into, was chosen to keep fast typists from jamming the keys of early mechanical machines. That concern has long since disappeared, yet it is not unusual for other tools to require information, impose procedures, and analyze data in ways that have become anachronistic. Keeping tools up to date, especially those that are part of a larger ecosystem of collaboration technologies, is a constant battle, and falling behind can lead to bugs and security problems that are difficult to overcome.
This is not a comprehensive list of concerns about incorporating technology into systems that support teams, but it does point to limitations that need to be considered. Overall, careful thought needs to go not just into the capabilities of the technologies but also into the social context in which they will be used.