"My vote for the World’s Most Inquisitive Tester is Shrini Kulkarni" - James Bach
My LinkedIn Profile : http://www.linkedin.com/in/shrinik
For views and feedback, do mail me.
Here are a few key points that I have developed since part 1 of this topic.
1. The problem with current "Agile" is that it is stuck, dying a slow death in rituals and ceremonies. So-called consultants and experts of "Agile" appear to be pushing rituals and ceremonies without explaining the context and meaning behind them. I find it very surprising that people feel proud about following rituals in a rationalist, objective engineering discipline. Don't you find the term "rituals" unacceptable in a field of software that stands as an epitome of human knowledge?
What happens when you do not know the reason and purpose behind a ritual and simply follow it? You will apply it wrongly, or apply it correctly to the wrong situations. When you follow something as a "best practice", you forget the context in which the practice worked and how similar or different your own context is. The aura of the best practice and the cult of the expert blur your thinking, and you get hypnotized. That is where problems start in Agile implementations.
2. There are many good practices in Agile - sorry - practices that have emerged from the kitchen (not the factory) of Agile. These are excellent examples of how smart people have solved problems in their context. If you understand the context and how the problem and solution aligned with it, you have a fair chance of learning, customizing and using the practice in your own context. I find practices like lean documentation, dev/test pairing, continuous integration, focus on delivering working software, and emphasis on the right distribution of automation across technology layers good and worth studying. If you start asking for the best practice, best tool or best framework, you will miss the background and end up applying a practice wrongly.
3. Most people agree on one thing about Agile: "culture". If you want to make Agile work in your context, you need a cultural change, regardless of what your current culture is. This may sound counterintuitive, but it is true. For Agile to work, you need a culture change.
Here is my prophecy about Agile and culture: "The culture change you are seeking for Agile to work IS NOT GOING TO HAPPEN". What is the basis for this prophecy? I think culture is made up of people working in groups following rituals while mostly setting aside rationality. Humans are lazy, unpredictable, fearful and greedy. Humans want to make profits continuously through software. While not fully understanding "intelligence", humans have set their eyes on "artificial" intelligence as the future. For problems in culture, humans seek solutions in processes, frameworks and tools.
If you want Agile to succeed, take these problematic humans out of the equation - and with them goes the need for all this trouble of changing culture. Can you?
What do you think? Let me know.
Here is my advice/suggestion on how one should approach getting answers to questions on a given topic (this applies to any quest to know something).
Before I answer a question, I will ask you: what do you think? How will you find out? What information or facilitation do you need to find the answer to this question?
This is how James Bach challenged me when I used to ask him questions in the beginning. As James kept pushing me back, I realized I must do some homework before asking. In the process, I learnt to find some hints or pointers myself for the question I had, and then seek help by saying: "Here is a question", "Here are my initial thoughts or pointers on this question", "Here is what I find contradicting or not fitting in", "Here are the sources of information that I used".
Most of the time, through this process of figuring out, you will get answers in 2-3 iterations without any external help. When you are stuck in this process of finding out, ask yourself: what information do I need? How will I get that information?
Give it a try - you will learn to find answers to your questions yourself, and that will be a fascinating journey.
Let me do a deep dive into this topic.
It's media hype and sponsored propaganda
If you read all such reports and media articles carefully, and apply some logic and analysis, it becomes clear that there is hype, and that a group of people with vested self-interest has been spreading the news. Most of these articles conclude with a call for readers to do something to avoid "job loss" or similar harm from automation. It might point to learning some so-called "new tool" or "technology", or to "taking up a (paid) course" or "getting a certification". So the commercial interest is apparent.

For the media, scaring people about some future danger has been a favorite tool for meeting its ends. Be it in healthcare, business or politics, spreading news about doomsday has worked well for the media to shape public opinion and even make the public take action. People rush to get vaccinated, buy a term insurance policy, get a health checkup, hit the gym (commercial interest again) or take a training course - all such actions have negative media propaganda in the background. As humans, through evolution, we have in our blood an affinity for negative or bad news. We are more likely to believe a prediction of bad news than a more compelling piece of good news. Media, sales and marketing folks exploit this. Can you see this in the tales about job losses through automation? They will scare you to the core. When one is scared, the rational and judgmental faculties of the human brain are at their lowest. Thus a bunch of scared folks first form opinions about a theme, and then act almost exactly as the "scare-mongers" expect.
What kinds of jobs are in danger from automation?
Compared to factory and manufacturing assembly-line jobs, which need human physical effort in addition to some cognitive skills, IT/software jobs are (or were) considered white-collar or brainy jobs. In IT and software, jobs involve varying degrees of human elements and intervention. Geniuses in the IT services world, riding the outsourcing wave, invented so-called "low-risk", non-strategic tasks such as data entry and management. These jobs were defined such that they merely required humans to follow some predetermined SOP (standard operating procedure) in a business process. When there is cost pressure, clients ask the service provider to bring in efficiency. How can one bring efficiency to such brain-dead jobs? Explore the option of removing humans from jobs that can be done more efficiently by a machine or a software program. Enter "automation". Look around your business or workplace: which jobs do not require human intelligence and empathy? If you find such jobs, you can see them going away, handed over to robots of some sort.
On the software technologies side, people say older technologies are going away. IT services companies providing outsourced technology services will need to support old technologies as long as the client pays for them. How long will a client stay with an old technology? That is a business and political question tied to the client's business. Typically there is a huge cost to moving from a legacy tech to a new one - it is called a "migration" or "re-engineering" program. Since such a change involves new learning for the staff, new infrastructure and the cost of development/migration, businesses tend to stick with an old tech stack until it becomes absolutely impossible to continue. When did businesses move from Windows XP to Windows 7 as the desktop operating system? Around 2013 or so, when Microsoft announced the end of support for Windows XP. This is an example of a technology upgrade. As an individual, if you are stuck with an outdated technology, watch out.
Is this new?
What do you understand by the term "digital"? In the early 90s it would mean anything done using a "computer". From the year 2000 onwards, it meant something done using the internet. In the last 6-8 years, it has meant "mobile". But at its core, in computing technology, "digital" contrasts with "analog". When did we last hear about "analog" computing devices? I had fun the other day arguing with a colleague that the internet is as "digital" as mobile. She believed that the qualifier "digital" applies only to "mobile". What will happen if quantum computers make their way into mainstream computing - will those computers be called digital?
Going digital for a business means, in a simple sense, that a part or the whole of the business involves "mobile technology". This shift from desktop computers to the internet and now to mobile has been causing many traditional jobs performed with older "digital" technology to go away - just as the digital camera era killed the likes of photo film makers such as Kodak.
Media propaganda makes one believe at first that such job losses are unprecedented and happening for the first time. In the past too, when computers first came, people who resisted them lost jobs, as in some sense the computer did the work better and cheaper than humans. Some intelligent ones immediately reskilled themselves and embraced the change. These folks not only survived the technology-change wave; some even flourished like never before. Like biological evolution, businesses constantly keep looking for ways to make more money from constant or shrinking capital and resources.
Your career is your responsibility
A software job, fortunately or unfortunately, is not covered by an employee union (by and large; there might be exceptions). When your company fires you without proper justification, you cannot knock on some outside entity's door to get yourself reinstated. Businesses worldwide that employ so-called skilled, white-collar workers can take the liberty of downsizing the workforce should the going get tough with falling revenues and profits. While on the job, keeping oneself updated with skills in emerging areas of technology and business is the individual's responsibility.
The Infosys-related Quora post above mentions that the affected people are trained in "cutting-edge" technologies. I ask: why do people get stuck in "blunt" or old technologies in the first place? Why do these folks (if at all they do) want their companies to take care of their careers or skills? Why can't they keep improving their skills based on emerging market conditions? If a company displaces people working on a "blunt" technology due to low or no demand, should you blame the company? While keeping people working on some outdated technology might be a business imperative for companies, getting stuck in outdated technologies, knowingly or not, is detrimental to one's career and to society at large.
If you are happy with a cool 9-5 job that does not require any great application of skill or knowledge, be ready to have your job made redundant at any time. When jobs that do not require skills are lost, the media might make noise about it. Again, if you see the vested interests behind this, it becomes obvious that it is an attempt to steer public opinion in one specific direction, away from reality. You cannot depend upon your company to keep you in front-line tech or business work all the time. It is your job to be good at what is in demand, and then to have the company keep you at the forefront.
When you hear "automation takes away jobs", ask "what kind of jobs?" and "what am I supposed to do?" Watch the reaction and share it with me. You should be able to smell the vested interest behind such a claim. Can you?
I would like to share two key lessons that I have learned over the years, which you can use to make the most of the money you are putting into automation.
If a test (case) can be specified like a rule - it MUST be automated
Automation code is software and thus, obviously, is built on some kind of specification. Most GUI automation (QTP, Selenium) is typically built from so-called "test cases" written in a human language (say, English). The first question an automation engineer asks when starting automation is: "where are the test cases?". In the dev world, automation takes a different meaning. In TDD-style automation (if you call TDD tests automation), the test itself is the specification: a product requirement is expressed as a failing test to start with. BDD pushes this to the other boundary: specify tests in the form of expected behavior. So automated tests are based on a specification that is in human language but expressed (mainly) in business terms and in a fixed format (Given-When-Then).
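To make the Given-When-Then idea concrete, here is a minimal sketch of a behavior specification and the automated check that executes it. The cart and discount function are purely illustrative assumptions, not from any particular BDD framework.

```python
# Hypothetical behavior specification, as a stakeholder would read it:
#   Given a cart with one item priced 100
#   When a 10% discount code is applied
#   Then the cart total is 90

def cart_total(price: float, discount_pct: float) -> float:
    """Illustrative domain function (an assumption for this sketch)."""
    return price * (1 - discount_pct / 100)

def test_discount_applies_to_cart_total():
    # Given a cart with one item priced 100
    price = 100.0
    # When a 10% discount code is applied
    total = cart_total(price, 10)
    # Then the cart total is 90
    assert abs(total - 90.0) < 1e-9

test_discount_applies_to_cart_total()
print("check passed")
```

Note how each Given/When/Then clause of the human-language specification maps one-to-one onto a line of the check; real BDD tools (Cucumber, behave) automate exactly this mapping.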
The key lesson here: if a test can be specified like a rule, with a clearly defined inference to be drawn from it, it should be automated. Automating a test means creating a program to configure, exercise and infer the results of whatever the test is trying to validate. Michael Bolton calls such a test a "check" - a meaningful distinction. If a test relies mostly on a human element for its inference, you cannot possibly automate the test in its full form.
How do you implement this lesson in your daily life as a tester? When designing a test, see if you can specify it like a rule. If you can, explore ways to write a program for it; then that test becomes automated. As you build a suite of tests this way, some will be specified in a way that makes them easy to automate, and some in a way that requires a human tester to apply her intelligence to exercise them and infer the results.
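As a sketch of "specify it like a rule, then program it": suppose (purely as an assumption for illustration) the rule is "a username is valid if and only if it is 3-20 characters of letters, digits or underscore". The rule configures inputs, exercises the code, and infers pass/fail with no human judgment needed - exactly what makes it a check.

```python
import re

# The rule, stated precisely enough to program (an illustrative example):
# valid username <=> 3-20 characters from [A-Za-z0-9_].
USERNAME_RULE = re.compile(r"\w{3,20}")

def is_valid_username(name: str) -> bool:
    return bool(USERNAME_RULE.fullmatch(name))

# The check: configure inputs, exercise the rule, infer the result.
cases = [
    ("ab", False),        # too short
    ("shrini_k", True),   # satisfies the rule
    ("x" * 21, False),    # too long
    ("bad name", False),  # space is not allowed
]
for value, expected in cases:
    assert is_valid_username(value) == expected, value
print("all checks passed")
```

A test like "does the signup page feel trustworthy?" has no such rule form, so it stays with the human tester.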
Automated tests (checks) are like guards for product code
A child asks his father, "What is the use of the brake in a car?" "It helps to stop the car," says the father. The kid responds, "No... I guess the brake helps the driver drive the car as fast as he wants, because he has a means to stop when needed." On similar lines, having automated tests around a piece of code - literally guarding the code - empowers the developer to make changes to the code faster. More often than not, the bigger speed-breaker for development is the fear of breaking working code. Developers mostly worry about large chunks of legacy code that they rarely understand fully. With automated tests as guards, any change in behavior is flagged via a failing test. Armed with guarded code, developers can make changes faster and depend on the tests to tell them if any change has broken some other "working" code.
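A minimal sketch of such a guard: the test pins down the current behavior of a poorly understood legacy function, so any refactoring that changes that behavior fails loudly. The function and its numbers are assumptions for illustration only.

```python
def legacy_price_with_tax(amount_cents: int) -> int:
    # Stand-in for old, poorly understood legacy code (an assumption).
    # Adds 18% tax using integer arithmetic in cents.
    return amount_cents + (amount_cents * 18) // 100

def test_guard_price_with_tax():
    # Change detectors: known inputs pinned to their current outputs.
    # If a refactor alters any of these, the guard fails immediately.
    assert legacy_price_with_tax(0) == 0
    assert legacy_price_with_tax(100) == 118
    assert legacy_price_with_tax(999) == 1178

test_guard_price_with_tax()
print("guard tests passed")
```

With this guard in place, a developer can restructure the tax calculation freely: the brake that lets them drive fast.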
How do you implement this lesson? Work with developers and help them create tests that guard their code. These tests should work like "change detectors". Writing this kind of test automation requires knowledge of the product code and the principles of unit testing. Not for faint-hearted GUI QTP/Selenium folks.