
Scrum: Is Scrum Master a Dummy

From a collection of writings I wrote in 2007 when I first explored this methodology.



Yesterday one of our Scrum Masters remarked, "What can I do? I have nothing to say. For all practical purposes I am a dummy, and it is the team which is responsible." I thought that was a very foolish comment to make, and at the same time I was puzzled by the opinion behind it.

The textbook definition of a Scrum Master is a "person who ensures that the Scrum process is used as intended. The Scrum Master is the enforcer of the rules and practices of the sprint. The Scrum Master protects the Scrum team from impediments and distractions."

To me this role is quite powerful, even though, unlike a typical project manager, the Scrum Master does not enjoy directive control over the team. Played well, however, the role carries a lot of leverage to influence the team, lead it towards the right practices, and insulate it from external distractions. The closest analogy I can think of for this role is that of a shepherd, who lets the sheep graze by themselves while keeping watch over them, ensures that the flock is protected and insulated from external threats (and acts to mitigate those threats if they arise), and leads the flock towards greener pastures.

I also believe that this role demands that the Scrum Master devote sufficient energy to the overall health of the team so that its productivity can increase. My personal opinion is that the Scrum Master should be an active participant at every stage of the Scrum development cycle and should facilitate and help the team accomplish its committed objectives. By no means can I accept the statement that this is a "dummy role".
