CHAPTER 1

Changing Role of Processes and Information

The importance of information is underlined in many of today’s businesses by the presence of an IT (information technology) or MIS (management information systems) function. This function can range in size from a single individual in a small business, to a department as the business grows, to a large internal organization, generally reporting to the top level of management, in enterprise businesses with many employees and multiple locations.

Now some of you may say at this point “Isn’t the addition of IT functions what integrated information and process management means?” The answer is complicated because the role of IT functions is in a constant state of evolution, driven by hardware technology advances, software development, economic pressures, security threats, and changing management strategy. These forces can drive an IT function to focus on directions that are not as well aligned with the basic goals of the business as they should be.

Evidence of less-than-satisfactory relationships between business operations and IT functions abounds in today’s business news, literature, and satire. When a new CEO takes over a company that needs changes to improve its bottom line, the financial media often report that one of his or her first actions is to reduce the IT staff distributed in departments throughout the company and to consolidate the remaining staff in the central IT function. What usually goes unsaid is that the distributed IT staff likely came about because the company’s central IT function was not meeting the needs of its internal information customers.

To be fair to the IT staff, such a situation is not entirely their fault. Many basic business processes have not been updated to take the best advantage of the features current information technology can provide. In addition, many workers do not have the level of knowledge to take full advantage of the tools available to them or to communicate effectively to the IT staff why the solutions provided by IT are not helping them to do the job their boss wants them to do.1 These mismatches cause considerable frustration when small-to-medium businesses (SMBs) try to implement scaled-down versions of enterprise IT applications such as manufacturing resource planning (MRP), enterprise resource planning (ERP), customer relationship management (CRM), and vendor relationship management (VRM).

In addition, there is often no consistent or established management policy for handling information across all functions of the company. Individual department managers usually have some rules in place, but those rules may not match the rules in other departments with which they need to share information. Having a policy that clearly defines what rules must be followed by all functions is necessary for good information and process management, but it is insufficient if it does not allow for some localized policies addressing situations not covered by the general policy. This is particularly true when the standard database does not have room for the extra information required by a single department. Many conflicts with a central IT function occur as a result of this situation. What SMB professionals should learn from these failures is discussed further in chapters 5 and 6.

Let’s take a moment to re-emphasize that the discussion in this book is directed more toward SMB professionals than toward those employed by larger enterprise organizations. The economy of scale in major corporations allows them to develop strategies and functions that are not economically practical for much smaller organizations with limited resources. For example, having separate functions or departments to handle IT-related activities and strategies is not the norm in smaller businesses. SMB IT support often comes on an as-needed basis from external sources or is assigned internally to one or a few individuals who are also likely to have other responsibilities in the business. That said, the basic considerations for integrating information with other business processes still apply; they just require some modification when the size of the organization is considerably different.

A Little History

Given that businesses of some form or another have existed for more than a few millennia, the presence of an IT function as we know it is relatively recent, only a few decades old. While computers, cell phones, and other common electronic information-handling and computation devices appear commonplace to younger business professionals, that was not true for many senior-level managers and professionals who began their working careers before the 1990s. For those of us who gained our formal training in trade schools, colleges, and universities from the 1960s to the 1980s, the pace of technological change has been staggering, and the ability to process ever larger amounts of data ever faster continues to grow exponentially in both quantity and speed. Keeping up with these trends and adapting quickly to them is an ongoing major challenge for businesses, government, and the public, and a continuous education requirement for professionals wishing to ensure the continuation of their careers.

A brief review of the history and nature of the factors and technological developments that have driven and affected the relationship changes between IT and other business processes will provide some background understanding and help set the context for our discussion later about future needs and strategies. While some advances initially were innovations searching for an application or driven by governmental and military needs, most of the later advances are results of the ongoing desire by businesses to perform necessary activities faster, cheaper, or more efficiently and accurately. These factors and developments, in rough chronological order, are as follows:

Development of devices for the government to handle large amounts of data faster and perform more accurate and complex computation;

Basic theory and analysis of an information communication system;

Use of mainframe computers by large commercial organizations;

Development of programming languages;

Establishment of internal functions to manage data processing assets and technical skills required;

Use of a network of terminals for input and output to a mainframe computer;

Development of magnetic storage technology;

Development of integrated circuit technology and the microprocessor;

Invention of the personal computer;

Development of spreadsheet software, followed by word processing, e-mail, and graphics applications;

Establishment of an information-sharing network for research and government institutions—the genesis of the Internet;

Establishment of satellite and fiber optic transmission capabilities;

Expansion of the Internet to the general public through the World Wide Web;

Development and implementation of broadband Internet access;

Development and expansion of cell phone communications;

Development of digital photography and the software to manipulate and transmit digital images;

Exploding growth of global business enterprises;

Cloud computing, virtualization, big data; and

A return to the terminal and mainframe model using the Internet.

Author’s Note: The dates quoted in the following history narrative are approximate. If one reviews references available on the Internet and in public libraries, there is often some conflict as to the exact date some technological event occurred, one of the many examples of the need to be careful in using data collected by others. In some cases, I am able to comment from personal experience that has included engineering contributions to and management of different aspects of these technologies and the rapidly changing IT environment. This experience has included not only developing and fabricating parts of the hardware such as integrated circuits, memory devices, and sensors, but also writing software applications for both machine control and data analysis.2

As populations and the businesses supporting them grew, the amount of data to be processed by larger institutions and governments increased to the point that manual bookkeeping methods, typified by Bob Cratchit’s employment in Dickens’ A Christmas Carol, could no longer handle the task in a timely and accurate manner. Simple calculating machines were demonstrated by Pascal and Leibniz as early as the mid-1600s, but it was not until the late 1800s that a commercially available tabulation machine using key entry, the Comptometer, was developed by Felt.3 While Jacquard used punched cards in the early 1800s for controlling looms, Hollerith is better recognized for his use of punched cards for recording the 1890 US census data and for inventing the tabulator and sorting machines to tally the data represented by the patterns of holes on those cards.4 His company merged with others in 1911 to form a company that later became the foundation of IBM.

Electronic device developments in the early 1900s for communication purposes also led to the development of analog computers for performing higher-level math functions such as integration and differentiation. These machines used electrical circuit analogs formed of amplifiers, resistors, and capacitors whose output voltages or currents represented the desired mathematical outcome. The first electronic digital computers were created during the WWII era; the exact dates of some developments are likely unknown because they were built to support military needs at the time. More familiar models developed after WWII were the ENIAC vacuum tube computer in 1946 and the first commercial computer, UNIVAC,5 available in 1951.

A core concept of information technology, often not noted in the business community, was described by Shannon, an electrical engineer and mathematician at Bell Labs, in a landmark paper on communication theory published in two parts in 1948.6 He considered the problem of communicating information quickly and accurately from one place to another (see Figure 1.1a). The basic elements are converting the information into a form that can be sent, a means of sending that form, and then converting that form at the desired destination into information that can be understood by the receiver.

[Figure 1.1 image]

Figure 1.1. Evolution of information communication processes: (a) basic model described by Shannon (1948), (b) model updated to include information processing for sending and receiving with storage buffers, and (c) a typical modern model including information processing while in transit. While these models show only one-way communication, they are also applicable to two-way communications.

This basic process has been used in many different ways by businesses and military groups throughout history to communicate between places and persons. Some examples are couriers carrying handwritten messages and maps, semaphore signals, heliographs, the US Pony Express, carrier pigeons, the telegraph, telephone, and radio. What became new with the development of computing technology was the ability to process information at the sending, transmitting, or receiving steps in ways that could speed up its transfer,7 alter its content in a useful way, or increase its accuracy in the presence of possible disruptions (noise) during each step of the communication process.
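
As a rough illustration of these ideas, the short Python sketch below models Shannon’s three elements with one of the simplest accuracy-improving schemes, a threefold repetition code with majority-vote decoding. The code is a hypothetical teaching example, not drawn from Shannon’s paper, and the flip probability is an arbitrary stand-in for channel noise.

```python
import random

def encode(bits, n=3):
    """Sender: convert information into a sendable form (repeat each bit n times)."""
    return [b for b in bits for _ in range(n)]

def channel(bits, flip_prob=0.1):
    """Transmission: noise may flip each bit with some probability."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits, n=3):
    """Receiver: majority vote over each group of n bits recovers the message."""
    return [1 if sum(bits[i:i + n]) > n // 2 else 0
            for i in range(0, len(bits), n)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = decode(channel(encode(message)))
print(message, received)  # the redundancy usually corrects isolated bit flips
```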

The development of this ability to process information in new ways was supported by the use of mainframe computers in large institutions for accounting functions and research computations in the 1950s and 1960s. Because these computers required large capital investments and new technical skills to operate, departments were established in these enterprise companies to manage the assets more efficiently. In the beginning, university courses in programming and computer systems struggled to keep up with the pace of development in hardware and software applications.8 Some progress was achieved during the late 1950s and 1960s with the development of software programming languages such as Fortran IV for engineering programs and COBOL for business applications.9 By the early 1970s, more workers were able to access these mainframe computers using terminals with CRT displays and keyboard entry, moving away from the earlier need to prepare punched cards and paper tape to enter data.

The presence of what became known as IT functions in large corporations became more established as the number of computer users in a company multiplied. At this time the use of computers was still primarily confined to number crunching and managing sales and purchasing transactions. While some inputs and outputs were alphanumeric for mailing addresses and basic customer information, the use of spreadsheets, word processing, e-mail, and graphics was yet to come. The use of computers for managing communications would not have been possible without the development of faster and easier-to-use information storage technologies to buffer the differences10 in operational speeds of computers, input/output (I/O) devices, and transmission methods as shown in Figure 1.1b.
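
The buffering role that storage plays can be sketched as a simple producer-consumer arrangement. The following Python fragment is a minimal, hypothetical model of a fast computer feeding a slow output device through a small buffer; the queue size and delay are arbitrary stand-ins.

```python
from queue import Queue
from threading import Thread
import time

buffer = Queue(maxsize=8)  # storage that absorbs the mismatch in speeds

def fast_producer():
    """Stand-in for a fast computer generating output records."""
    for i in range(20):
        buffer.put(f"record {i}")  # blocks only when the buffer is full

def slow_consumer():
    """Stand-in for a slow I/O device such as a printer or tape drive."""
    while True:
        record = buffer.get()
        time.sleep(0.01)           # simulated slow device
        print(record)
        buffer.task_done()

Thread(target=slow_consumer, daemon=True).start()
fast_producer()
buffer.join()  # wait until the slow device has caught up
```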

The development of storage technology initially was driven by mainframe users needing larger and faster data storage capacity for computation and record keeping. Writing data magnetically on ferric media replaced punching holes in cards or paper tape. Banks of tape drives11 became common in many financial institutions, and smaller, faster magnetic drum units were employed in research computers. Eventually, rapidly spinning magnetic disks replaced tape drives in the 1960s and, through continuous innovation,12 became much smaller while providing significantly more storage capacity.

Transistors, used initially for low-power control and audio applications, began replacing vacuum tubes in computers as their performance at higher frequencies improved with advances in semiconductor fabrication technology. This reduced the size and considerable power consumption of mainframe computers and increased their speed and reliability. More importantly, it allowed the use of thousands of computing elements within a single computer instead of hundreds. The development of integrated circuit technology in the 1960s allowed combining hundreds of transistors on a single silicon chip at first, then thousands, millions, and billions as the fabrication technology evolved to manufacture more devices per chip. This enabled the development of the microprocessor (a single programmable computer on one chip), which provided faster computation capability in a much smaller device and at a much lower cost. It also drove the development of semiconductor memory devices13 that could provide storage of comparable scale with write/read speeds able to keep up with the microprocessor’s computation speed. The same technology would later be used to fabricate the high-resolution image sensors used for digital photography. Without these design and concomitant manufacturing advances, it is hard to imagine how today’s computers, digital cameras, cell phones, flash drives, and home entertainment devices could have come about.

How disruptive these technological advances could be to established businesses is demonstrated by the introduction of the first pocket scientific calculator, the HP-35, by Hewlett-Packard in 1972. Although its introductory price was about eight times the cost of a top-of-the-line slide rule, its computation speed, digital display, and range of math functions made it instantly desirable to engineers and scientists. Other manufacturers, most notably Texas Instruments, followed with similar versions, which helped bring prices down; after the introduction of a programmable version, the HP-65, in 1974, slide rule manufacturers had all but disappeared by 1980.

Similar advances in graphics and printing capability affected the traditional drafting equipment businesses. The design engineering lofts of my youth, containing rows of drafting tables, T-squares, triangles, drafting instruments, and India ink pens, are all but gone from today’s business workplaces, replaced by computer workstations running engineering design software and large-scale plotters for the dwindling number of drawings that are not sent electronically from the workstations directly to CNC (computer numeric control) machines.

For those readers whose working career started in the 1970s and 1980s, the heady days of implementing computer control systems and of personal computers entering the business environment were challenging and rapid paced. Smaller computers such as Digital Equipment Corporation’s PDP-11 and Hewlett-Packard’s HP-1000 and HP-9825A became available to industry in the late 1960s and 1970s for use as test system controllers or in factory control applications, replacing the previous manual or hardwired relay logic methods for controlling manufacturing equipment. This allowed more accurate and complex control that also could be easily changed to fit custom customer requirements. However, their individual cost and the user training required prevented businesses from considering them at that time for general use in office applications.

When technically savvy individuals were exposed to what these smaller computers could do in their universities or workplaces, some of them began to work at home or in their garages developing computers they could afford to own personally and experiment with.

At first, early personal computers such as the Apple II, TRS-80, and Commodore PET were considered by businesses to be toys for individual computer enthusiasts. While a number of factors made these computers part of the business environment, in the author’s opinion the development of spreadsheet software such as VisiCalc was a major contributor. This software allowed individual business users to play what-if games with data in a familiar accounting format of rows and columns of numbers.14 Once businesses began to invest in personal computers for spreadsheet analysis, the increased demand helped drive down the cost per unit, and the potential size of the software market allowed other applications such as word processing and graphics to grow rapidly. Such applications allowed businesses to make more effective use of what was still a considerable capital investment per worker.15 Significant contributors to these changes were the introduction of IBM’s PC16 favored by numbers-based businesses and the Apple Macintosh computer17 favored by graphics-based businesses.

As more and more information became available in electronic form, universities, research institutions, and government organizations began sharing data, developing computer network protocols and communication links to facilitate this need.18 ARPANET, developed by the US Department of Defense’s Advanced Research Projects Agency during the 1960s, was one of the first of these networks and is reported as first connecting multiple computers in 1969. A decade later, USENET provided the ability for users to dial up over telephone lines to access information and post messages to others in user forums, sometimes called bulletin board systems (BBS), and newsgroups. Larger corporations with geographically distributed locations began developing their own private e-mail systems at the department level, and these became more available to individual business users as personal computer installations expanded in the workplace.

The web of information that had been shared only among large corporations, research institutions, and government organizations expanded to a much wider individual and commercial audience in the 1990s with the arrival of the World Wide Web. Graphical interfaces (browsers) made it easier for users with limited programming skills to access information and view richer content. Processing speeds increased, along with the ability to handle larger amounts of data at once, allowing computers to process data faster at the sending and receiving stages and now even during the transmission phase (see Figure 1.1c). These processing speeds often eliminate the need for buffering data at the transmitting and receiving ends. They also allow streaming data to be monitored and analyzed in real time for a faster response to changing conditions, such as bad weather along a transmission path or a significant change in the data reported by a remote sensor. However, this ability also increases security risk because it provides new opportunities for unauthorized users to access and alter data.
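
As a minimal, hypothetical sketch of the real-time monitoring idea, the Python fragment below examines each value of a simulated sensor stream as it arrives instead of buffering a batch first; the feed, threshold, and alert action are invented stand-ins.

```python
def sensor_readings():
    """Stand-in for a live remote-sensor feed; a real source might be a socket or message queue."""
    for value in [20.1, 20.3, 20.2, 35.7, 20.4]:
        yield value

THRESHOLD = 30.0  # hypothetical level marking a "significant change"

for value in sensor_readings():
    if value > THRESHOLD:         # analyze each value while the data streams in...
        print(f"alert: {value}")  # ...and respond without waiting for a batch
```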

A number of businesses began developing applications to serve this rapidly growing audience. While many who invested during the Internet financial boom of the 1990s did not do well, this activity helped establish the infrastructure of satellites,19 fiber optic networks, cell phone towers and networks, and data compression techniques that support the communication and e-commerce applications many of us take for granted today.

The evolution of electronic imagery had its roots as early as the 1920s, driven by the desire to send images using radio waves or telephone/telegraph lines. Some of us had our first experience with electronic images watching Sunday afternoon Buck Rogers or Hopalong Cassidy serials on the small black-and-white screens of early television in the late 1940s and early 1950s. While some digital images were available on university and government databases in the 1970s, the common use of imaging technology in business did not expand until the 1980s when flatbed scanner development and printer output technology were able to keep pace with personal computer implementations. The real change that affected how we use and distribute visual information was the advent of the digital camera.

While integrated circuit technology was not yet capable in the early 1980s of providing a sensor with adequate resolution and exposure sensitivity for moving images, the situation changed when Kodak developed the first megapixel sensor using CCD technology in 1986. Little did Kodak realize that they had just set in motion the technological changes that would lead to the significant reduction of their core business in slightly less than two decades. Rapid advances in digital photography resulted in much higher resolutions and more compact inexpensive cameras in the late 1990s. A concomitant transition from analog video recording tape to digital media like CDs, DVDs, and flash memory modules made taking and storing high-quality images easier and more affordable. This transition was a major contribution to the changes in how we process, handle, and transmit information. In particular, it allowed the reduction or even the elimination of print media for communicating information. The impact of this on traditional postal, billing, advertising, and printing businesses has been significant and is still evolving.

Some form of global economy has existed since before the days of Marco Polo, driven by the need for products such as minerals, spices, or crafted items that are not available in all geographical locations. Wars have been waged to gain control over some of these locations. The rapid growth in information technology has enabled a corresponding growth in global economies, which has altered supply chain strategies in ways that are still evolving. Major changes include the growth of cloud computing, the use of enterprise-level software across many locations, the virtualization of hardware systems, and the concept of Big Data. Interestingly, all of these trends have led some businesses to move their IT approach back to an updated version of the mainframe–terminal access model used in the 1950s. We will discuss aspects of this in subsequent chapters.

Business Process Evolution

How business processes are viewed also underwent significant changes during the same period, driven both by the same technology changes that affected information handling and by the shift from a primary focus on manufacturing a physical product to one more focused on providing a service consisting of mostly intangible items. As a result, process models must be updated from describing a simple sequence of operation steps to models that include organizational interactions, levels of customer involvement, required peripheral support activities, links to IT functions, and, most important, the flow and content of information required by the process and its partners.

To illustrate some of the effects of these changes and the need to consider information processes more thoroughly with other processes, let’s consider two examples. Example 1.1 discusses changes in a common medical procedure using some of the technological advances mentioned earlier. Example 1.2 discusses the changes in a common process model when the associated background and information processes are considered.

Example 1.1. Medical X-ray Process

Many of you have had an X-ray taken for medical reasons at some clinic, dentist, or hospital. If it was taken a few years ago, the basic process probably was something like that illustrated in Figure 1.2a. For a moment, disregard the physical activities required and recognize that this is basically an information process. The input is a set of instructions or request for information from the doctor and the desired output is the X-ray image of the area of interest that will be further analyzed by a radiologist. The transfer of this information is likely a written instruction from the doctor as to what area is to be X-rayed and possibly what the doctor wants to investigate or verify. The output is a piece of photographic film showing the desired image.

[Figure 1.2 image]

Figure 1.2. Evolution of medical X-ray process: (a) using chemical processing of the X-ray image and paperwork for other information communication and storage and (b) dematerialized information processes using digital imaging and electronic storage of data.

The process of taking the X-ray takes some time because the exposed film must be processed to develop the desired image, and there is a reasonable likelihood that the X-ray might have to be taken again if the initial exposure is not detailed enough or not focused accurately on the desired area, if the developing process is not optimum, and so forth. Not shown are the background activities required for the successful completion of this process. These could include insurance authorization, maintaining an adequate inventory of photographic film and development chemicals, and some place for the patient to wait in case a retake is necessary.

Today, this process is likely to be considerably different and to take much less time in more developed parts of the world as shown in Figure 1.2b. The need for photographic film, associated chemicals, and a film developing facility is eliminated as a result of improvements in digital imaging sensor technology for X-ray exposures. This also provides the advantage of an image that is electronic in nature. The percentage of retakes required is reduced with the only remaining major cause likely being that the initial exposure was not focused accurately on the region desired. Not so obvious is that the instructions from the doctor are also likely to be received electronically by the X-ray facility in an interoffice e-mail format. The other background activities are also affected. The insurance information is likely to be handled electronically and there is no longer a need for a separate patient waiting area for retakes because they can be taken almost immediately.

This example illustrates two important points. First, the way information is obtained is undergoing significant changes that allow a process to be performed more quickly, more accurately, and in a more usable form. Second, the more usable form of electronic data allows equally significant changes in how that information is transmitted, shared, and stored. This dematerialization of data from physical media such as paper and photographic film has had the greatest impact on the improvement of current business and information processes.

Consider the X-ray film image output from the process shown in Figure 1.2a. It must be reviewed by a radiologist to obtain the proper interpretation of what it depicts. Because it is in a physical form not easily copied, its interpretation requires a local radiologist, who may not be available outside of normal working hours, a serious consideration in an emergency room situation. Storing that film for future reference takes up space, requires physical identification labeling and proper cataloging for future retrieval, and requires some protection to ensure its image quality.

Today, however, that image can be sent to any radiologist in the world for interpretation at any time and can be easily copied so that it can be shared among several radiologists in different locations. This helps ensure that at any time of day in any location a prompt interpretation of an X-ray is available for emergency situations. The storage space required is also reduced and has the added feature of being readily available to anyone in the world.20

Example 1.2. Fast-Food Restaurant Customer Process

As one of the work exercises in the combined process management plus IT introduction course required of business school sophomores at Oregon State University, I asked my class to get together in groups of two or three students to create a process diagram of the steps that they, as customers, would need to go through to obtain an order of food from a typical local drive-through fast-food restaurant. While readers from some parts of the world may not be familiar with such a business arrangement, the basic process for ordering food from a vendor, paying for it, and departing with one’s order is fairly universal. A typical student group answer is shown in Figure 1.3a.

[Figure 1.3 image]

Figure 1.3. Fast-food restaurant customer process as modeled by typical business students: (a) initial model primarily based on what they as a customer would experience, (b) an updated model after they were asked to add what local unseen activities might be required, and (c) a more refined model after being asked to consider what might be necessary to support the steps in (a) and (b). Chapter 3 will discuss this process in more detail using three different modeling approaches.

As shown, many students just connected the dots with a straight line. At this time, we discussed the importance of using arrows to indicate what happens next, particularly when a decision loop might result in an earlier step needing to be repeated, as illustrated in Figure 1.2a for the medical X-ray example. Like that example, the basic customer process at a restaurant requires a significant amount of information to be exchanged and processed. Most students usually recognize the need to tell the restaurant what they want and the need for the restaurant to tell them in return what their order costs. Students who are more astute include a step where they read the menu before choosing what to order. This is another information transfer step, in which the restaurant communicates to its customers the available food choices and their respective prices.

The student groups were then asked to consider what might be going on behind the scenes to enable the restaurant to process their order and to add those steps to their process diagram and draw a dashed line between their original answer and the added steps. This served to introduce them to a basic service blueprint diagram. A typical student group result is shown in Figure 1.3b.

Most of the students added obvious steps for cooking food, pouring beverages, and packaging their order. We then had discussions about whether each step was likely performed by a different person, which steps could be performed by a single person, the minimum staff required, and what information processes might be needed. This was followed by a discussion of the importance of somehow showing the information flow between steps or operations. Was the flow one-way or an exchange? Was a step a creator of information, or a user, or both? Was the information stored or retrieved, was it shared, was it duplicated? Finally, students were asked which step they considered the most dependent on accurate information for the success of the customer process. Most selected the obvious ordering step, and some selected the payment step.

The work exercise was concluded by asking the students to give some thought as to what further background steps were necessary for the successful completion of the steps they had added to Figure 1.3b. Each group was asked to add those to their service blueprint and to separate those new additions with another dashed line for an expanded service blueprint. A typical result is shown in Figure 1.3c.

A number of students got stuck at this point because they had not yet become familiar enough with business processes to consider factors such as employee training, purchasing, work schedules, and so forth. However, they often recognized the need for adequate supplies, utilities, and a janitorial staff. The information needs of these necessary background activities and functions are typically more complex and are more likely candidates for the applications we will discuss later.

You are encouraged to work this exercise on your own, giving some thought to the information processes that are necessarily associated with physical processing steps such as preparing food and collecting payment. What do you consider is missing from these typical student results? What could be done to ensure the accuracy of the order information or the payment process? How would you depict the information processes on this service blueprint?
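
One possible way to depict the information processes, sketched here in Python purely as a hypothetical illustration (the step names and information items are invented and are not part of the original exercise), is to record for each blueprint step what information it creates, uses, and stores:

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    creates: list = field(default_factory=list)  # information this step produces
    uses: list = field(default_factory=list)     # information this step consumes
    stores: list = field(default_factory=list)   # information retained for later

customer_process = [
    Step("Read menu", uses=["menu items", "prices"]),
    Step("Place order", creates=["order"], uses=["menu items"]),
    Step("Quote total", creates=["amount due"], uses=["order", "prices"]),
    Step("Pay", creates=["payment record"], uses=["amount due"],
         stores=["payment record"]),
    Step("Receive food", uses=["order"]),
]

for step in customer_process:
    print(f"{step.name}: creates={step.creates}, uses={step.uses}, stores={step.stores}")
```

Walking through such a list quickly shows which steps, such as ordering and payment, both create and depend on accurate information.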

The two examples provided in this introduction illustrate the need to think differently about how information processes are treated in relationship to other business processes. To obtain the best improvement in both, their interaction and co-dependency must be considered as an integral part of any process analysis and improvement effort. A subsequent book will discuss the methodology for such efforts in more detail. Modifications to traditional process analysis and improvement approaches to integrate information will be part of that discussion. For the moment, let’s return to our current topic of integrating the management of information and processes.

This can be difficult to do in businesses with long-established IT functions whose purpose has not changed much since the days when their primary goal was managing large capital assets and technical resources to meet an organization’s computation and reporting needs reliably, efficiently, and accurately at minimum cost. A common strategy was to standardize databases and software applications, which worked well when the number of users in a company was limited. But when the potential applications of information processing expanded beyond payroll, purchasing, and sales, the one-size-fits-all approach began to frustrate newer users whose data needs, formats, and less structured information did not match the standards in place. Despite the grumbling among some groups that what IT provided did not fit well with what they were asked to do, the necessary use of terminals to access the mainframes in most companies helped enforce an approach of making do with the standards the IT group had established.

This grudging acceptance changed when personal computers capable of running standalone software applications were added to a company’s information processing capability and began replacing the terminals. Now technically savvy users had the ability to get things done and store data locally, eliminating the need for mainframe access to do their work. This led to multiple databases containing some of the same information, but usually in different formats, making it difficult to correlate results among different departments and users. Different word processing and spreadsheet applications also led to electronic documents that could not be easily shared with others not having the same software. These disparities even extended to different printing formats depending on the printer manufacturer selected by a department.21
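
A small hypothetical example shows why such disparities made correlation difficult: two departments record the same customer in different local formats, and the records match only after both are normalized. The field names and values below are invented for illustration.

```python
sales_record = {"cust": "ACME Corp.", "phone": "555-0100"}
support_record = {"customer_name": "Acme Corporation", "tel": "(555) 0100"}

def normalize(name, phone):
    """Reduce a name and phone number to comparable canonical forms."""
    return (name.lower().rstrip(".").replace("corporation", "corp"),
            "".join(ch for ch in phone if ch.isdigit()))

same_customer = (normalize(sales_record["cust"], sales_record["phone"]) ==
                 normalize(support_record["customer_name"], support_record["tel"]))
print(same_customer)  # True, but only after both formats are normalized
```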

Some IT functions responded harshly to this chaos, persuading upper management to reinforce the standards the IT group had selected.22 Others took a hands-off approach, focusing their efforts on their previous core responsibilities and leaving the personal computer users to support their own systems and to resolve major data conflicts between departments.

To alleviate some of this friction, many software developers and hardware manufacturers responded by providing translator routines or adapter modules that allowed their particular software application to read documents created by other applications and to print using a variety of printer protocols. The problem of multiple databases and different computer applications and hardware in different departments remains to this day. One can often read in the business news about some new CEO working to reduce IT operations costs by eliminating IT staff distributed in departments and consolidating databases. This situation illustrates the lack of an integrated approach to managing information and other business processes. As discussed later in chapter 5, this lack is one of the major causes of failed or ineffective implementations of cross-functional software platforms such as ERP and CRM.

Some larger enterprise businesses have established MIS functions to provide more of a focus on the business information side of IT as opposed to a hardware infrastructure focus. In some cases, the MIS function replaced a company’s previous IT group, particularly when the primary activity of the company is not manufacturing. In other organizations where manufacturing is a major component, an MIS function coexists with an IT function that handles daily operations and the hardware and network infrastructure. The effectiveness of an MIS function in either situation can be measured by how well it avoids the proliferation of the local, lower-level situations described earlier, as well as by its successful implementation and use of cross-functional software platforms.

Author’s Note: The definition of what a function does in a company is often modified to best suit that company’s needs. Hence, there is no universal definition of what the differences between an IT and an MIS function are. What distinguishes one from another is often blurred, accounting for the existence of only one or the other function in some companies and the presence of both functions in other companies.
