Wednesday, July 31, 2019

Literary Criticism Essay

The beauty about literature is that it can be interpreted in a number of ways and all those ways can be regarded as a possibility and even entirely true. No matter what angle, approach or perception a person uses to see, analyze and scutinize a literary work, that analysis will always be considered as a â€Å"possibly correct† literary crticism.However, this is also the problem regarding literary criticisms since there is no wrong analysis, anything can be true. Thus, an analysis does not necessarily help readers in further understanding the text, analyses just give readers the chance to look at a text in a different light.This is what Liane Norman gives readers when anliterary criticism was written regarding the famous work of Herman Melville—Bartleby the Scrivener with a subtitle of A Story of Wall Street. In Norman's analysis, â€Å"Bartleby and the Reader†, she gives importance on the relationship of the text, Bartleby the Scrivener, and the readers itself of the text. According to Norman, the text focused on having the reader as an important character or making the reader play an important role in the structure and interepretation of the text.This analysis of Norman is with loopholes, as with many literary criticisms. She does indeed present a rather effective and convincing judgement on Bartleby the Scrivener but she failed to point out the important of the text on its own. It is as if, the text cannot exist without the role that the reader played in the creation of the text when in fact, Bartleby the Scrivener can be seen as a creation which is intended to point something out to the reader.This argument is what will be contained in this â€Å"criticism† of a literary criticism wherein a premise is presented that Norman did deliver a good argument and judgement on Bartleby the Scrivener but she failed in delivering a good argument that should have not illicited further contestations. In Herman Melville's Bartleby the Scrivener: A Story of Wall Street, a scrivener (or copyist or clerk in a firm), is the protagonist while the narrator is the protagonist's boss but who, it seems, wants to be the protagonist himself.Tthe Lawyer who is Bartleby's boss gives too much information about himself and too little information on the protagonist, Bartleby. The story starts off in the Lawyer going on about how he hired and met Bartleby but not before going to great lengths as he introduces himself: I am a rather elderly man. The nature of my avocations for the last thirty years has brought me into more than ordinary contact with what would seem an interesting and somewhat singular set of men, of whom as yet nothing that I know of has ever been written:—Imean the law-copyists or scriveners.(Melville, 2006) The Lawyer and Bartleby soon venture into a rather harmonous and beneficial relationship with each other—that is until Bartleby decides one day that he has enough of being a scrivener and stops doing his job properly to the point that he does not do anything at all. The curious and perverse Lawyer just lets Bartleby be as the scrivener goes on with life doing absolutely nothing. Unfortunately, things get out off hand to the point that Bartleby is imprisoned for hanging out in the building when it is neither his home ror has he any rights to loiter in the place.This ends the story: as Bartleby refuses everything—companionship, food, water—he dies a sad death in prison, all alone. Later on, the Lawyer finds out that Bartleby has been working in the Dead Letter Office wherein he sorts the mail of dead people. The Lawyer associates this previous job of Bartleby as the reason why the scrivener has become depressed and decided to one day, to just let everything go. In Liane Norman's Bartleby and the Reader, the role that the reader plays in giving meaning to Bartleby and the Scrivener is focused on.According to Norman (1971), there is a â€Å"rigorous and demanding human tr ansaction that takes place between the reader and the story†. This transaction is the ability of the text to have meaning only when the reader wills it do so. Thus, the dialogue, lines and other descriptions in the story would be moot and pointless if the reader does not believe otherwise. In fact, the reader becomes a character in the story itself without being in it; as what Norman (1971) asserts, â€Å"the reader is both participant and judge† in the same way that the Lawyer or the narrator of the story is also the participant and the judge.Thus, while the Lawyer is one of the characters in the story, his way of storytelling wherein he is detached from the other characters makes him have the same role as the reader. This in turn, makes the reader as the Lawyer and the Lawyer as one of the readers. But more than this form of analysis, Norman takes the notch further by relating the text and the characters to a greater and more profound extent by juxtaposing it with Chr istian values or ideals and the nature of democracy—two things which are inherently, albeit subtly, presented as the themes in Bartleby the Scrivener.On the other hand, the way Norman anaylzed the literary text was correct in a way that she gives meaning to the context and the content but remiss in her ability to add too much interepretation and meaning to what could have been just simple or meaningless lines. This is perhaps a bigger mistake in not being able to see much meaning in a literary work, that of seeing too much—a case of overreading. Norman was not false in her analysis, but she was extreme in that too much interpretatation is given from too little information.Thus, her mistake was that she was not able to give importance to the interepretation of the literary text as a text itself but she instead, concentrated on the text as how it would be interpreted by the reader. Bartleby the Srivener does not become merely Bartleby the Scrivener but it becomes, instea d, Bartleby and the Reader. Norman placed too much emphasis and importance on the reader as being part of the literary work and literary analysis. But the reader is of course important, for who will analyze a text but that being the reader himself/herself?However, what Norman has done is to indicate that there is but one reading presented by the reader and that is the only correct reading while at the same time the reader is no one but herself. What Norman should have done is present the analysis on the text as being Bartleby and A Reader instead of having it as Bartleby and THE Reader. For using the determiner â€Å"the† indicates that there is only one reader and that one reader is and will always be right.Thus, Norman's analysis gives a vaild credibility to her argument—even if the argument is indeed credible, it is unfortunately not valid. Although, there is something which is admirable and commendable in Norman's analysis which is the last part of her article wher ein she gives a profound interpretation on the implication of the Lawyer's last lines regarding Bartleby's death: The deep sense of disappointment that the story inspires in the reader is a function of the aura of America's hight but impossible promises: men have not escaped their limitations simply by founding a new policy.Bartleby is the test of democratic- Christian principle. If his resistance exposes human shortcomings, his persistence reveals man stubbornly laying claim to his humanity. (Norman, 1971) Norman maginificently gives a clear idea and interepretation on Bartleby's death while at the same time, relates its implication to humanity which is humanity's tendency to gain new insights but miserably ends in not carrying out those new â€Å"policies† or insights.Over all, both Melville (in using the character of the Lawyer) and Norman are correct, society stubbornly believes in their humanity—even if it proves that their idealist views on humanity's â€Å"huma nness† is sometimes misplaced. References Melville, H. (2006). Bartleby the scrivener: a story from Wall Street. Great Short Works of Herman Melville. New York: HarperCollins Publishers, Inc. pp. 19-38. Norman, L. (1971). Bartleby and the reader. The New England Quarterly 44 (1): 22-39.

Tuesday, July 30, 2019

Dc Power Supply Design

Abstract: The main aim of this assignment is to design a pre amplifier circuit with an NPN transistor to be used in a simple public address (PA) system. The pre amplifier is fed from a microphone that produces an average output voltage of 10 mV rms. The amplifier is to operate over a frequency range of 300 Hz to 5 kHz and should have an adjustable volume control. The expected gain of the amplifier is 100.First we are going to design an amplifier for given specifications, model the operation of the circuit using h-parameter and r-parameter model, use computer aided design software to analyze the circuit performance and demonstrate the working of the circuit by hardware implementation. Then, we will plot the frequency response of the circuit and analyze the effect of the emitter bypass capacitor. Finally we will compare the mid-band gain, bandwidth and lower cutoff frequency obtained from the simulation result and the hardware implementation with the designed values. Chapter 1Introduct ion: Bipolar Junction Transistor (BJT) is a three terminal device with three regions (Emitter, Base and Collector) and two PN junctions (Emitter-Base junction and Base-Collector junction). Since there are two junctions that means there are four possible ways of biasing a transistor. If both junctions are forward biased then the transistor will operate in the saturation region. If both junctions are reverse biased then the transistor will operate in the cut of region. These two conditions of operation are used when the transistor is needed to work as a switch.To use a transistor as an amplifier, the emitter base junction should be forward biased and the collector base junction should be reverse biased. Amplifier is an electronic circuit that can amplify signals applied to its input terminal. If an AC signal is given to a transistor amplifier it will produce an AC base current. This AC base current will produce a much larger AC collector current since IC=? IB. The AC collector current produces an AC voltage across the load resistor RL, thus producing an amplified, but inverted, reproduction of the AC input voltage in the active region of operations.DC load line is a sloping straight line connecting all the operating points of a transistor biasing drawn on the output characteristics of the transistor and the intersection point gives the Quiescent point (Q-point). A prober Q-point should be in the middle of the DC load line. Selecting a good Q-point prevents the transistor from going into the cutoff or the saturation region and gives more stability. A fixed bias (i. e. base bias) circuit or a voltage divider bias circuit can be used for this assignment but a voltage divider circuit is more efficient.The main disadvantage in a fixed bias circuit is that ? ac depends on temperature, which means ? ac is not stable. And when ? ac changes, IC will change(IC=? IB) and VCE will change. The changes in IC and VCE make the Q-point unstable. Whereas in voltage divider bias c ircuit, IC is independent of ? ac and hence the Q-point is more stable. Voltage divider bias is widely used because reasonably good stability reached with a single power supply. Chapter 2 Problem Description: The problem is to design and fabricate a pre amplifier circuit with an NPN transistor to be used in a simple public address (PA) system.The input of the pre amplifier circuit is taken from a microphone that produces an average output voltage of 10 mV rms. The amplifier is to operate over a frequency range of 300 Hz to 5 kHz. Also, it should have an adjustable volume control. The expected voltage gain of the amplifier is 100. Design Specifications: Voltage gain = 100 Lower cut off frequency = 300Hz Vin = 10mV (rms) RL = 10k? DC power supply = 10V to 15V Type of transistor – NPN We will begin our assignment by selecting a suitable transistor. Then we will decide on a DC voltage supply and assume a prober Q-point (IC, VCE) to carry out the design.We will start the design by calculating the values of Resistors RC and RE and the voltage divider resistors R1 and R2. After that we will calculate the values of the two coupling capacitors (C1 and C2) and the emitter bypass capacitor (CE) for the required cut off frequency. After finishing the mathematical model we will simulate the circuit using OrCAD to analyze the circuit performance. Then, after finishing the simulation, we will assemble the circuit using approximate values of the calculated ones. Finally, we will compare the simulation results with the hardware results.The results we will be focusing on are the voltage gain, the cutoff frequency and the Bandwidth. Chapter 3 Circuit Diagram and Design: Av = 100 FL = 300Hz Av = 100 FL = 300Hz Figure 1 – Circuit Diagram Step1 – Selection of Transistor, Supply Voltage (VCC) and Collector Current (IC): The selected transistor should have a minimum current gain (? ) that is equal to or greater than the desired voltage gain. Therefore, we will us e Q2N2222 in this assignment. Since the output voltage swing is not specified in this assignment, we will choose 12V as our voltage supply. We will choose IC as 4 mA. Transistor: Q2N2222Supply Voltage: VCC = 12 V Collector Current: IC = 4 mA * To carry out the design we need to draw the dc equivalent circuit. Figure 2 – DC Equivalent Circuit Step2 – Design of Collector Resistor (RC) and Emitter Resistor (RE): VCE = 50% VCC = 50% ? 12 = 6 V VE = 10% VCC = 10% ? 12 = 1. 2 V VRC = VCC – VE – VCE = 12 – 6 – 1. 2 = 4. 8 V RC = VRCIC = 4. 8 V4 mA = 1. 2 k? RE = VEIE = VEIC = 1. 2 V4 mA = 300 ? , since IC ? IE Step3 – Design of Voltage Divider R1 and R2: ? = 100 (data sheet) R2 = ? RE10= 100? 30010 = 3 k? VB = VBE + VE = 0. 7 + 1. 2 = 1. 9 V VB = VCCR2R1+R2 R1 = VCCR2VB+R2 = 100? 3k1. +3k = 16 k? * Now we need to draw the ac equivalent circuit. Figure 3 – AC Equivalent Circuit Step4 – Design of RE1 and RE2: RE = RE1 + RE2 Rout = Rc || RL = 1. 2? 101. 2+10= 1 k? r'e = 26mIE = 26mIC = 6. 5 ? AV = Routr'e+RE1 r'e+RE1= RoutAv = 1k100 = 10 ? RE1 =10 – r'e = 10 – 6. 5 = 3. 5 ? RE2 = RE – RE1 = 300 – 3. 5 = 296. 5 ? Step5 – Design of Coupling Capacitors C1 and C2: hie = Rin (base) = ? (r'e+RE1) = 100 ? (3. 5 + 6. 5) = 1 k? Rin (tot) = R1 || R2 || Rin (base) = 1116+13+11 = 716. 4 ? XC1 = Rin(tot)10 = 716. 410 = 71. 64 ? C1 = 12? fLXC1 = 12 300? 71. 64 = 7. 4  µF XC2 = RC + RL = 1. 2 + 10 = 11. k? C2 = 12? fLXC2 = 12 300? 11200 = 47. 4 nF Step6 – Design of Bypass Capacitor CE: R’S = R1 || R2 = 16. 09? 316. 09+3 = 2. 5 k? Re = RE2 ||{ R’S ? + (r'e+RE1)} = 296. 5 ||{ 2500 100+ (6. 5+3. 5)} = 296. 5? 35296. 5+35 = 31. 3 ? XCE = Re10 = 31. 310 = 3. 13 ? CE = 12? fLXCE = 12 300? 3. 13 = 169. 5  µF Av = 100 FL = 300Hz Av = 100 FL = 300Hz Figure 4 – Circuit Diagram with values Simulation Results: With CE: Mid-band gain, AV = 99. 8 Lower Cutoff Frequency, FL = 334 Hz Higher Cutoff Frequency, FH = 20. 6 MHz Bandwidth, BW = FH – FL = 20. 6 M – 334 = 20. 6 MHz Without CE: Mid-band gain, AV = 3. 5Lower Cutoff Frequency, FL = 305 Hz Higher Cutoff Frequency, FH = 46 MHz Bandwidth, BW = FH – FL = 46 M – 305 = 46 MHz (Circuit Diagram and Frequency Response are enclosed along with this report) Chapter 4 Hardware Fabrication and Testing Details: During circuit assembling process we tried to find the nearest values to the calculated ones. These are the values we used: RC = 1. 2 k? we selected1. 2 k? RE1 = 3. 5 ? we selected4. 5 ? RE2 = 296. 5 ? we selected270 ? R1 = 16 k? we selected15 k? R2 = 3 k? we selected2. 2 k? C1 = 7. 4  µF we selected10  µF C2 = 47. 4 nF we selected47 nF CE = 169. 5  µF we selected147  µF Procedure: . Assembled the circuit on a breadboard and connected a DC power supply of 12V. 2. Applied a sine wave of 10 mV amplitude and 100 Hz frequency to the input. 3. Observed the output waveform in the CRO and noted down the amplitude. 4. Increased the input signal frequency in steps, without changing its amplitude, and noted down the output amplitude at each step. 5. Calculated the voltage gain of the amplifier by the equation, AV = Vout/Vin found the voltage gain in dB by the equation, AV (dB) = 10 log (AV). 6. Plotted the frequency response curve and found the frequencies (fL and fH) for which the gain reaches 0. 07 of mid band gain. 7. Found the frequency range between fL and fH which gives the bandwidth of the amplifier. Hardware Results: With CE: Frequency (Hz)| Vout (mV)| AV| AV (dB)| log f| 100| 182| 18. 2| 25. 20| 2. 0| 500| 662| 66. 2| 36. 42| 2. 7| 1 k| 750| 75. 0| 37. 50| 3. 0| 5 k| 784| 78. 4| 37. 89| 3. 7| 10 k| 786| 78. 6| 37. 91| 4. 0| 50 k| 786| 78. 6| 37. 91| 4. 7| 100 k| 786| 78. 6| 37. 91| 5. 0| 500 k| 786| 78. 6| 37. 91| 5. 7| 1 M| 786| 78. 6| 37. 91| 6. 0| 2 M| 784| 78. 4| 37. 89| 6. 3| 5 M| 770| 77. 0| 37. 73| 6. 7| 10 M| 728| 72. 8| 37. 24| 7. 0| 50 M| 344| 34. 4| 30. 73| 7. 7| 100 M| 182| 18. 2| 25. 0| 8. 0| Mid-band gain, AV = 78. 6 Lower Cutoff Frequency, FL = 2. 6 B = 398 Hz Higher Cutoff Frequency, FH = 7. 35 B = 17. 78 MHz Bandwidth, BW = FH – FL = 17. 78 M – 398 = 17. 78 MHz Without CE: Frequency (Hz)| Vout (mV)| AV| AV (dB)| log f| 100| 12| 1. 2| 1. 58| 2. 0| 500| 32| 3. 2| 10. 10| 2. 7| 1 k| 36| 3. 6| 11. 13| 3. 0| 5 k| 38| 3. 8| 11. 60| 3. 7| 10 k| 38| 3. 8| 11. 60| 4. 0| 50 k| 38| 3. 8| 11. 60| 4. 7| 100 k| 38| 3. 8| 11. 60| 5. 0| 500 k| 38| 3. 8| 11. 60| 5. 7| 1 M| 38| 3. 8| 11. 60| 6. 0| 2 M| 38| 3. 8| 11. 60| 6. 3| 5 M| 38| 3. 8| 11. 60| 6. 7| 10 M| 36| 3. 6| 11. 13| 7. 0| 50 M| 26| 2. 6| 8. 0| 7. 7| 100 M| 18| 1. 8| 5. 10| 8. 0| Mid-band gain, AV = 78. 6 Lower Cutoff Frequency, FL = 2. 55 B = 356 Hz Higher Cutoff Frequency, FH = 7. 6 B = 39. 81 MHz Bandwidth, BW = FH – FL = 39. 81 M – 356 = 39. 81 MHz (Frequency responses of the circuit with and without CE are enclosed along with this report) (Frequency responses of the circuit with and without CE are enclosed along with this report) Chapter 5 Discussion and Conclusion: * First of all, there are several ways and various methods to design a common emitter amplifier or so-called RC coupled amplifier that are completely different than the one we used.We did not choose this method because it is the best method, actually, there is no such a thing called the best method. There are simple ways and there are more accurate ways. It depends on the primary assumptions, the design specifications and the thumb rules used. Simply, the method we used achieved the design requirements and accomplished desired results. * An Amplifier is a circuit that is capable of amplifying signals applied to its input terminal. The main component in any amplifier circuit is usually a transistor.Since the transistor configuration we used is a common emitter configuration, the circuit is called a Common Emitter Amplifier. Unlike other configurat ions, CE amplifier exhibit high voltage gain and high current gain. Generally, the process of a common emitter amplifier can be explained in three steps. First, the AC input signal produces an AC base current. Then, This AC base current will produce a much larger AC collector current since IC=? IB. After that, The AC collector current produces an AC voltage across the load resistor RL, thus producing an amplified, but inverted, reproduction of the AC input voltage. To use a transistor as an amplifier it should be operated in the active region (linear region). To set a transistor in the active region both junctions, Emitter-Base junction and Base-Collector junction, should be forward biased. Since changes in in temperature and other factors during the amplification process may drive the transistor into the cutoff or the saturation region, the Q-point should be in the middle of the active region to enhance the stability of the amplifier. * We preferred using a voltage divider bias cir cuit over other biasing circuits because in this kind of biasing circuits, IC is independent of ? nd therefore the Q-point is more stable. Voltage divider bias circuit is widely used because of the good stability reached with a single power supply. * C1 and C2 are called coupling capacitors. They pass ac from one side to another and block dc from appearing at the output side. In addition to that, C1 act as a high pass filter on the input signal and its value must be chosen so that it does not attenuate the frequencies which are to be amplified. Similarly, C2 also must be prevented from attenuating the output signal. * The bypass capacitor CE provides an effective short to the ac signal round the emitter resistor RE2, thus keeping only RE1 seen by the ac signal between the emitter and ground. Therefore, with the bypass capacitor, the gain of the amplifier is maximum and equal to AV=Routr'e+RE1 . Without the bypass capacitor, both RE1 and RE2 are seen by the ac signal between the emit ter and ground and effectively add to r'e in the voltage gain formula. Hence, AV=Routr'e+RE1+RE2 . * r'e is a dynamic resistor that depends on temperature. If AV was dependent only on r'e, and RE1 was not there (i. e. AV=Routr'e ), AV will be unstable over changes in temperature because when r'e increases, the gain decreases and vice versa.In order to minimize the effect of r'e without reducing the voltage gain to its minimum value we partially bypassed the total emitter resistance RE. This is known as swamping which is a compromise between having a bypass capacitor across RE and not having a bypass capacitor at all. RE1 should be at least ten times greater than r'e to minimize the effect of it. In our design RE1 is less than r'e and hence it will not do anything other than slightly reducing the gain to be about 100. In other words, in our design RE1 is somehow useless. * At lower frequencies, a capacitor will act as an open circuit.At higher frequencies, a capacitor will act as a s hort circuit. That is because the capacitive reactance is inversely proportional to the frequency (XC=1/2? fC). In an RC coupled amplifier circuits, at lower frequencies, more voltage drops across C1 and C2 because their reactance is very high. This higher signal voltage drop reduces the voltage gain of the amplifier. Similarly, at lower frequencies, the reactance of the bypass capacitor (CE) increases and this reactance in parallel with RE1 create an impedance that reduces the voltage gain.This is why RC coupled amplifier circuits have less voltage gain at lower frequencies than they have at higher frequencies. However, at higher frequencies, the reactance of the internal transistor junction capacitance goes down and when it becomes small enough, a portion of the output signal voltage is fed back out of phase with the input, thus effectively reducing the voltage gain. * Our hardware implementation results and simulation results were different. Obviously, that is because we did not find the exact values for our design. There was a notable difference between the design values and the values we have selected, especially for R2.The cutoff frequency (fL=398 Hz) is somehow acceptable but the mid band gain (AV=78. 6) is a little bit less than the desired one. Increasing the value of R2 could have solved the problem. It could have increased the voltage gain and reduced the cutoff frequency. * One of the aims of the design is to have an adjustable volume control. There are several ways to do this. One of them, and I think it’s the best, is by using a variable resistor in place of RE1 (i. e. a 100 ? variable resistor). Basically, this resistor is inversely proportional to the voltage gain (AV=Routr'e+RE1 ).Reducing the value of RE1 will increase the voltage gain, thereby increasing the volume and vice versa. References: 1. Theodore F. Bogart, Jefferey S. Beasley and Guilermo Rico (2004). Electronic Devices and Circuits. India: Pearson Education, Inc. 2. Thomas L . Floyd (2005). Electronic Devices. 7th ed. India: Pearson Education, Inc. 3. HyperPhysics  (2004)  Common Emitter Amplifier,[online] Available at: http://hyperphysics. phy-astr. gsu. edu/hbase/electronic/npnce. html [Accessed: 20th Nov 2011]. 4. Scribd  (2006)  Common Emitter Amplifier,  [online] Available at: http://www. cribd. com/doc/27767944/Common-Emitter-Amplifier [Accessed: 25th Nov 2011]. 5. Visionics  (2005)  RC Coupled Amplifier,  [online] Available at: http://www. visionics. ee/curriculum/Experiments/RC%20Ampr/RC%20Coupled%20Amplifier1. html [Accessed: 1st Nov 2011]. 6. SSIT  (2006)  Analog Electronic Circuits,  [online] Available at: http://www. ssit. edu. in/dept/assignment/aeclabmanual. pdf [Accessed: 5th Nov 2011]. 7. Edutalks  (2007)  RC Coupled Amplifier,  [online] Available at: http://www. edutalks. org/electronics%20lab%20manual%201. pdf [Accessed: 7th Nov 2011].

Monday, July 29, 2019

To The Grils Who Are Going to College Tone Essay

â€Å"To Girls Who Are Going to College† by Helen Keller, Keller uses an overarching passionate tone that shifts from reminiscent to Instructive In order to strengthen college women's confidence. Through repetition of the word â€Å"you†, Keller comes across as sentimental, almost as if she is trying to recall her own memories in the eyes of the reader. By trying to reenact her memories in the form of writing, Keller succeeds in drawing the audience together to sympathize with college women.On the other hand, women visualize themselves through her memories and become reassured that everything will work out fine. Seller's reflective tone is mainly seen in the beginning of the essay, but her nostalgia carries on throughout the whole passage. As the nostalgia starts to fade away towards the end of the reading, Keller transitions into a more commanding, motherly type role and takes advantage of the use of imperative words to convince college women to follow her, while not s ounding o hostile.Sentences start off with authoritative verbs and turn into pieces of advice, such as â€Å"do not forget†, â€Å"Learn from your books†, and â€Å"Rebel against the hardness and injustice†. Keller asserts her ideas in a more friendly and loving way, which enables the audience to obey her advice without thinking she is too authoritative. In return, college women feel empowered by her insights and gain strength leading up to the school year. Helen Keller manages to speak to the audience through her passion by petition and verb usage.By connecting her own memories with the future, Keller enables women to see things through her own eyes. She helps them realize, through use of an overarching passionate tone, that most things will not be expected, but to still push through either way. To The Grills Who Are Going to College Tone Essay By Nikolas passionate tone that shifts from reminiscent to instructive in order to strengthen college women's confiden ce. Through repetition of the word â€Å"you†.

Financial strategy Course -case on Mergers & Acquisitions- Essay

Financial strategy Course -case on Mergers & Acquisitions- - Essay Example The kind of acquisition opted for Capita plc is reverse merger which will provide the benefits of public company thus acquired by it. The expectation of Capita related to the increase in the revenue of Capita plc has well been met. The valuation of iSoft at the point of dealing was AUD 28.5 million from the proceeds of the sale of assets. These earnings of the company were mainly to repay the financial dues of the company. As iSoft Business solutions was considered a non-core part of the iSoft group, so the company took the decision to sell it off as thing wing of the company was yielding negative returns. Moreover, further investment in this particular wing of the company may not have been useful. Thus the decision of selling off the company was taken by the management of the company. However, the major success that Capita plc obtained by acquiring iSoft Business solutions was unthinkable. This led to the company to further acquisition of companies expanding its market in varied fie lds. Thus from the study it can be inferred that acquisition is a better option than a merger. Probable decline in the revenue and the rise in the cost of iSoft Business solutions led to the acquisition of the company by Capita plc. Appreciation of the Australian dollar against the Euro was one of the main reasons behind the decline in earnings of iSoft Business solutions. This eroded the revenue of the company by $108 million in comparison to the previous year. In fiscal year 2010, over 70% of the revenues of iSoft’s revenue were denominated in GBP and Euro. Even the other regions in which iSoft Business solutions operated had more or less flat revenue. Thus the company anticipated a large portion of cost towards the revival of the company but the growth in revenues did not materialize. The major portion of the iSoft’s cost being fixed in nature resulted in the decline in the flow of

Sunday, July 28, 2019

Critical theory, philosophy Essay Example | Topics and Well Written Essays - 2500 words

Critical theory, philosophy - Essay Example that part of a person’s imagination which is justified through the knowledge he or she garners via various life experiences like reading, interaction as well as seeing and hearing. This understanding is a part of one’s overall perspective and subsequent take on life. Therefore, this book is an important one as human understanding is deeply connected with the written word and the way it has been written. If those words, whether in a philosophical text or otherwise, manage to justify an individual’s personal truth, or appeal to his or her sense of balance in life, then it contributes to his or her understanding. In this way, the choice of Locke’s text is an important one for this paper. Locke’s text starts with a study of innate notions. This portion of the book is a study of the elements that lead to speculation and a subsequent formation of perspective. Throughout this part, Locke has managed to hook the reader on to the idea that speculation is an element that must be used in very discreet doses as more of it can damage the practical side of things in one’s mind. (Locke, 2007) As a philosophical notion, this is an ideal that is true to writing. In writing, it is imperative to stick to a certain balance between factual information and a small amount of speculation. This holds on to people’s imagination and memories. Therefore, in this part of the book, Locke has merely described a style of writing. Further into the book, one will find ideals that are connected with principles in the mind. The mind is an organ that churns out thoughts and expressions of the same. These expressions are a part of the basic mental setup of the person concerned. While every individual does not need to be a writer, it has been said often that there is a book in everybody. This is largely an overthrow of the fact that literary skills have been highly respected in many people. It is a desirable quality. This quality, in turn, springs from an ability to form a successful

Saturday, July 27, 2019

Segment 10 and 11 Term Paper Example | Topics and Well Written Essays - 1000 words

Segment 10 and 11 - Term Paper Example On October 1802, Spain's King Charles IV signed a decree that transferred the Louisiana territory to France and the Spanish representative in New Orleans, following instructions Spanish court, canceled Americans' access to the port's warehouses. These moves angered the United States. Jefferson and Secretary of State James Madison worked to attain a feasible resolution through diplomacy, but other factions called for war, so that the U.S. could seize Mississippi and New Orleans. In January 1803, Jefferson recommended that James Monroe accompany Livingston in Paris as minister extraordinary. This would be called the Lewis and Clark Expedition. Jefferson wrote to Kentucky's governor, James Garrard, to notify him of Monroe's appointment. Monroe was to offer $10 million for the purchase of New Orleans and all, or part of, the Florida territories. If negotiations failed, Monroe was ordered to try buying New Orleans, or, at the very least, ensure American access to the Mississippi and the p ort. When Monroe got to Paris on April 12, 1803, Livingston informed him of different circumstances. Napoleon agreed with the recommendation of France's minister of finance, Francois de Barbe-Marbois, that it would be more strategic for France to sell Mississippi to the U.S., to avoid for it being seized by Britain, in the event of a future war. Soon, the U.S. purchased Louisiana from France for $0.03 per acre, or $15 million. This added around 252 million more acres to the American territory. The War of 1812 concerned the military conflict between the United States and Great Britain from 1812 to 1815. One of the main issues was Impressment, where the British could take away British sailors in American ships. Napoleon's statement in 1810 that revoked his decrees and British refusals to rescind their orders increased the pressures for the U.S. to go to war. On June 18, 1812, President James Madison approved a declaration of war that Congress passed at his request, although not withou t significant opposition. The Treaty of Ghent ended the War of 1812, but it did not resolve the fundamental issues that stimulated the war. The Treaty states that â€Å"all territory, places and possessions whatsoever, taken by either party from the other during the war† would be reinstated, as they were before the war. No one gained anything and Impressment was not duly addressed. Synopsis of Monroe Doctrine President James Monroe delivered a speech on December 2, 1823 that included the Monroe Doctrine. In his message to Congress, Monroe provided a set of principles of the Monroe Doctrine: 1) The Western Hemisphere can no longer be colonized; 2) The political system of the Americas differs from Europe; 3) The United States will consider all intrusions in Western hemispheric affairs as a danger to its security; and 4) The United States will cease from participating in European wars and would not agitate European colonies in the Western Hemisphere. What was Monroe trying to ac hieve with his Monroe Doctrine? Monroe wanted to deal with potential threats to the U.S., specifically that which comes from the interests of European powers in colonizing territories in the New World and to ensure that diplomacy is used first before engaging in any war against other nations. SEGMENT 11 Summary The Industrial Revolution began in Great Britain, during the 18th century. In the U.S., the industrial revolution started in the nineteenth century. During this time, the

Friday, July 26, 2019

Elizabeth Murray'abstract art gives me special meanings Essay

Elizabeth Murray'abstract art gives me special meanings - Essay Example She portrayed a gloomy home life by bathing cartoonish technique, comprising kitchen utensils, desks, shoes, and others seen in houses. Personally, her paintings pushed me to think more profoundly about their goofy, ridiculous, and comical nature, but I only felt somewhat disturbed. In her earlier works, Murray depicted human features, by interweaving non-figurative colors, lines, and shapes. She used multi-paneled installations, alongside vibrant and daring colors to fascinate and trick the viewers’ eyes. She made use of every dimension, and is particularly recognized for her designed canvasses (Lacayo para 2-3). Her naughty, silly, and wild technique is all about colorful composition and wild forms against the organized and methodical abstract art. She totally recreated Modernist abstraction into cartoonish humor and essence. The above picture is one perfect example of Murray’s wacky, spirited, yet deliberate, calculated technique. In this painting, she is combining abstract three-dimensional canvases to form scenery of unique shades, colors, and systematic mixtures. It is a large image of a hotchpotch, painting, and figure; mixed all in all and colored vibrantly and raises a sense of wackiness, but sympathetic accuracy which is integrated in all its exquisite disorder (PBS(a) para 4). The application of smooth, horizontal color reveals that she is not attempting to mislead the viewers or make them believe there is something deeper than what has been painted or shown in front of them. Based on my analysis, I think she is trying to guide her viewers to the reality that abstract images can be objects too. Even though these are not ordinary, mundane objects that can simply be recognized, I think she is attempting to copy commonplace objects employing her own artistry, ingenuity, and imagination. She makes use of living organic shapes all over the painting which look like human body parts. By condensing and squashing these humanlike shapes into her

Thursday, July 25, 2019

Assignment # 7 Essay Example | Topics and Well Written Essays - 750 words

Assignment # 7 - Essay Example Policy makers only consider less number of options when tackling problems. The selected policy options only differ marginally with the existing policies. For each option, just the most significant consequences are considered. There is no best policy assessment; an excellent policy is one that all participants consent on rather than what is most excellent to solve a crisis. Incremental policy advocates for making corrections; it centers on small modification to already existing policies rather than remarkable fundamental modifications. In this model, policy-making is also sequential; you have to keep solving problems as mistakes become obvious and are rectified. Fresh approaches to the problems are developed. In this model urgency and importance are the barriers. Since the model advocates for small parts solving of problems, some problems may have serious consequences and need fast solutions. Additionally, some problems may be new and of high risk which need new policies. The mending of already existing polices may not solve the problems. With the analysis of problems before implementing or creation of policies should be advocated so as significant policy change to occur. The Steams theory applies the consideration of three example applications of streams, i.e. politics, policies and problems. However, the streams have drawbacks which hinder effective policy making. First, the independence of the streams is questioned. The streams are said not to be independent which makes them manipulated easily. People tend to embrace solutions that they have a belief that it will solve their problem. They do not identify solutions because that solves a particular issue. Politically, in governments, regardless problems have been solved or not, problems still arise. The streams only come together when solutions and problems are attached and presented to political audiences. The other barrier is that the streams need a proper entrepreneurial culture

Wednesday, July 24, 2019

British Petroleum America, Inc Case Study Example | Topics and Well Written Essays - 750 words

British Petroleum America, Inc - Case Study Example According to the research findings BP had a clean history until the oil spill. A look into its history proves that at least in principle, the company gave attention to health, safety, and environmental standards. For example, in the year 2005 alone, the company invested nearly $ 8 million on promoting the production and marketing of low carbon power from alternative energy sources like solar, wind, and natural gas. Moreover, the company took initiatives to provide low cost liquefied petroleum gas to low income customers. In addition, the company claims that the company prepared new principles in the year 2005 in order to address the increased demand for environmental vigilance. Thus, it becomes evident that the company was very careful to create an environmentally ethical image as it had to access environmentally sensitive areas like Alaska. It was in March 2006 that the worst oil spill in the history of the company took place in the North Slope of Alaska’s tundra. It took fiv e days to discover the oil leak, and by that time, nearly 200000 to 270000 gallons of crude oil spilled into the area. Though the exact reason of the spill is still unknown, the company has been criticized by many for its improper maintenance of pipelines. In fact, the responsibility to maintain and operate the Trans-Alaska Pipeline System where the leak took place is with the Alyeska Pipeline Service Company. But BP cannot evade from its responsibility to oversee that proper maintenance takes place. Though the primary reason, according to many, is the aging and deterioration of the pipelines in the supply system, there are various factors that are believed to be the cause of early deterioration and leakage. The first such factor is the diminishing quality of the crude oil that passes through the pipeline (p. 12). It is pointed out that as more and more oil exploration takes place, the quality of crude oil has declined substantially. The company spokesperson opines that the reason b ehind corrosion can be the presence of water and sediment in the oil (p. 12). On the other hand, the opinion of Steve Marshall, the president of BP Exploration, is that the reason lies in the presence of an emulsion-breaking additive in the oil. The ultrasonic tests conducted in the year 2005 identified increasing corrosion in the pipeline, and as a result, the company increased the budget for pipeline maintenance, and increased the frequency of pipeline inspections. Despite all these efforts, the leak took place at a place where the pipe was buried underground. Anyway, the company and its environmental policies have become a center of media attention. In addition, the Office of Pipeline Safety has directed BP to conduct thorough repairs and investigations and to report the same to the office. The company can use the pipeline only after it receives permission from the federal agency. In addition, the company is likely to face a fine that can go up to $ 2.1 million (p. 10). Lastly, t he efforts of the company to present itself as a ‘green’ one has faced a serious set back due to the incident. 1. From the very beginning

Tuesday, July 23, 2019

Different Aspects of Life of International Students Essay - 2

Different Aspects of Life of International Students - Essay Example There are a few numbers of students who do not suffer the cultural issues, while on the other hand there are many other people who have to face many hurdles just because of the lack of understanding with the host cultures (Luget 2014; Mason 2002). In addition to the cultural aspects, there are many other issues, which can create a problem for the settlement of the student in the international environment, which is not at all his homeland. In this paper, the topic of the research is the concerns of life of an international postgraduate student. By the end of the paper, we will be able to highlight major issues just because of the research based on an actual interview of a postgraduate student. There are many categories of the qualitative interviews as described by the research. The three most common types of such interviews are structured, semi-structured and unstructured interviews. The structured interviews more frequently fallout from incisive quantitative data and therefore the format of this research study would be on either semi-structured or unstructured interview, preferably semi-structured (Robert 2013; Saunders 2006). The unstructured interviews usually refer towards the collection of observational data while on the other hand, semi-structured interviews are the one and only reliable source for qualitative research. Semi-structured interviews are concerned about the around already constructed open-ended questions, or we can say free opinion-based questions. One question direct the interview session towards the next question. More questions could follow relating to the previous one, and the whole perspective could be brought into light (Robert 2013; Saunders 2006) . The most suitable type of interview for the study is semi-structured format just because of the nature of questions. As the topic is ‘Different aspects of a life of an international postgraduate student,’ it would always be a better idea to gain a deep insight about the perceptions and the actual difficulties, which a student may face in a foreign culture.

Antoine Lavoisier-Life, Contributions, and the French Revolution Research Paper

Antoine Lavoisier-Life, Contributions, and the French Revolution - Research Paper Example He studied at College Mazarin from 1754 to 1761, where he was taught several subjects, such as Botany, Mathematics, Chemistry and Astronomy. In 1771, when he was 28 years old, he married Marie-Anne Pierrette Paulze, when she was barely 13 years old. Marie-Anne also took a fancy to chemistry and assisted her husband in translating crucial English documents in French. Furthermore, she came out with a biography of Antoine Laurent Lavoisier by the name ‘Lavoisier’s memoirs’. Lavoisier’s father bought a title for him in 1772, and consequently he came into membership with a privately owned company called the Farmer’s General that collected taxes from the royal government. Subsequently, his wealth and influence amplified considerably. Since he was a member of the Gun Powder Commission, he resided in the Paris Arsenal where he built a private laboratory to investigate and analyze the results of chemical experiments which had been performed by others, and als o to carry out his own. During the year 1791, he was appointed as a Secretary of the Treasury (Scott, 2). Antoine not only came with the discovery and naming of oxygen. He also established the procedure of rusting and asserted the significance of oxygen for the survival of animals and plants by ascertaining its role in respiration. He was also one of the first people who performed some complex chemical experiments, which gave rise to stoichiometry. Furthermore, he also founded the law of conservation of mass and, with the assistance of his chemical experiments, he managed to determine that animals made use of oxygen as a respiratory gas and this gas exchange was a process, which was used to create heat, and it was also very similar to the process of burning of a candle. Other than his role as a physicist, botanist and chemist, Lavoisier also achieved a law degree, but he never practiced law formally in his life. He was a prominent member of the Ferme Generale, and was also one of th e 28th tax collectors of France. During the French Revolution, he was exposed to the ire of the French revolutionaries. Being a liberal, he had to undergo major opposition from Jean-Paul Marat who supported revolutionaries. When the French Revolution was at its peak, Jean Paul Marat pressed treason accusations against Lavoisier for selling watered-down tobacco and several other crimes. During the year 1794, the period of the â€Å"Reign of Terror’, Antoine provided help to some foreign scientists and mathematicians, for example, Joseph Lagrange, under treason (New Advent, 1). The judge presiding over the case of Lavoisier rejected the appeal to forgive Lavoisir’s life and to let him go on with his unfinished work. He said, â€Å"The Republic needs neither scientists nor chemists; the course of justice cannot be delayed.† Consequently, on 8th May, 1794 when Lavoisier was 50 years old, he was guillotined in Paris. Lavoisier’s contribution to the inception of advanced chemistry was primarily concentrated in the field of theory. He added extensions, summarized and confirmed the theories and discoveries of several of his contemporaries in England and the European Continent, particularly Henry Cavendish (1731-1810), Joseph Priestley (1733-1804) and Joseph Black (1728-1799). The consequence was that there was a new and more profound understanding of chemical processes that created the

Monday, July 22, 2019

Health Law and Regulations Essay Example for Free

Health Law and Regulations Essay Health care is high on the list of the most regulated entities. Regulated by the government, the health care sector is also regulated by different private bodies. The Joint Commission on Accreditation of Health Care Organizations (JCAHO) together with the National Committee on Quality Assurance (NCQA) and different medical specialties form part of the private health care regulatory entities that collaborate with the government. Health care regulation is focused on three main roles; cost control, quality control, and access expansion and control. These three functions are subdivided into objectives covering each aspect pertaining to the health care area. While the regulatory program exists to accomplish the three above-mentioned objectives, the implementation of each objective affects one another. Example, quality control causes a reduction of access, and increases the cost because of an increase in demand. Despite the interdependence of these objectives, health care regulation does not indulge competition amongst the regulatory bodies. Important in the regulatory industry are those who engage each other with the same goal towards improving the health care. A majority of the health care regulatory federal agencies in America are comprised within the Federal Department of Health and Human Services (DHHS). The American constitution directs all health care regulators to obey the set legal process as their activities contain the potential to limit or breach the rights of health care. Health care is a high level of bureaucracy and extensive legal procedures. Regulators are provided a notice for their proposed regulation with findings to support it, after which the sector under regulation is allowed to contest or appeal the proposal. The legal process is appealed in every health care procedure, whether if it’s to test a new drug, suspend a practitioner’s license, or a regulation on  environmental standards. The Affordable Care Act (ACA) is a health care regulation signed into law on the 23rd of March, 2010. The law’s main focus had been to increase the affordability and quality of American health insurance. Its policies were focused on lowering the rates imposed on the uninsured through the expansion of both the private and public insurance covers. It had also aimed to reduce the health care costs incurred by the government along with citizens. Barely seven days after its enforcement had a new health care law come into effect with amendments to the ACA. On March 30, 2010, the president of the United States signed into law the Healthcare and Education Reconciliation Act of 2010. The law had been enacted by the 111th US congress (Blackman, 2013). The ACA has advantages as disadvantages, and has been at the forefront of political criticism since its enactment. Its strongest opponents have cited it to be punitive of the high-end earners to cushion the middle and lower classes. It has also been reported to weigh heavily on the nation’s wage bill. In a nutshell, the ACA is designed to cover the majority of American’s health care insurance. However, the regulation’s cost factor has proven unsustainable without economically hurting the high earning entities. The effect of ACA’s implementation has caused an overall negative economic realignment as various entities strive to remain afloat; working hours have been significantly downsized by various corporations in anticipation of unsustainable insurance compliance. The ACA regulation is seen as an economically crippling element in America’s overall economic composite. The enormous tax burden shouldered by high earning entities is evidenced to trickle down to the middle and low income earners which resultantly deduces the benefits intended for these groups in an even more severe way. To begin with, although the ACA provides affordable or free health care insurance to tens of millions of American populations, funding is raised through taxes. With a hike in taxes for health care funding, earning populations are left with less to spend. The American middle and low income groups are even more affected by the adverse effects as inflation sets in to recover the growing deficit induced by the regulation’s implementation. The ACA had been endorsed as an affordability initiative but the repercussive costs have indicated the regulation as a costly affair across the board. Insurance players report certain clauses in the regulation as detrimental to the process. An example is the regulations directive for insurance to extend their coverage even to sick uninsured people at no extra cost. The resultant effect has been the rise in insurance premium costs which further complicates the insurer’s role in the initiative. Nearly all the beneficial aspects within the regulation are countered with contradictory challenges that undermine its purpose. While Medicaid is expanded by the regulation to cover an estimate 15.9 million citizens below 138% of the property level, the cost is met by state and federal funding which further imposes an immense measure of tax escalation. The regulation however features more benefits than limitations with regard to women initiatives. The ACA grants up to 47 million women access to health care services comprising wellness and preventative care. Additionally, the law prohibits women paying more than men for health care services as had been the case prior to enactment (Blackman, 2013). The ACA regulation started 157 new agencies, boards and programs to oversee the efficient implementation of the law alongside regulating health care spending. Although there are negative cost implications associated with the huge oversight entities provisioned in the regulation, proponents argue these costs to be necessary in controlling the unaccounted health care expenditure (Blackman, 2013). Employment in America is currently readjusting to comply with the regulation’s 2015 implementation phase requiring all employers to provide an insurance cover on their employees. The resultant effect to this change has been two faced; small business have been employing part time employees full time to comply with the 2015 mandate while large businesses have been reducing part time working hours to avoid paying the employees insurance when the phase is implemented. The ACA is illustrated as a complex employment factor with many jobs feared to be lost as many new ones are created. Notably, the regulation projects an  outcome where employees will freely leave their respective jobs without fear for losing retirement benefits affiliated to health care. Accordingly, the regulation aims to decrease employees working hours while maintaining and creating new employment opportunities. Despite the employment benefits highlighted within the ACA, many citizens remain skeptical of the upcoming 2015 employer-insurance phase. Dissenting political sentiments are pitching the impending reforms as a negative aspect of the ACA enactment set to diminish numerous job opportunities. Federal and private health care regulation remains as an important component in the broader sense of the health industry covering every single aspect entailed in human health. The quality, cost and access control objectives are characteristic to every health related industry. With regard to personal experience, I have on several occasions observed medical licenses revoked for certain practitioners following a legal process to dispute the quality displayed by the practitioners in context. The two mentioned above 2010 health care regulations contain a complex and mostly long term agenda aimed at bettering the quality of health care services in America. The current challenges are largely short-term and should not be invoked to undermine long-term benefits. A healthy debate is however essential to ensure minimized negations throughout the implementation process. References Top of Form Blackman, J. (2013). Unprecedented: The constitutional challenge to Obamacare. Bottom of Form

Sunday, July 21, 2019

Bowling For Columbine Essay

Bowling For Columbine Essay Throughout Bowling For Columbine an anti-political, critical and persuasive perspective is dominant. Bowling For Columbine is a documentary directed, written, produced and narrated by the controversial Michael Moore. The 2002 film aims to open the eyes of Americans and people worldwide to gun control. The movie is based on the shooting massacre that occurred at Columbine High School, where two students Eric Harris and Dylan Klebold entered their Alma Mater and killed 15 people, also injuring an additional 21 students. The film investigates gun control in the USA and the lack of law and regulation on gun ownership. People have various controversial views on the movie Bowling For Columbine , especially relating to how much of Moore s film is supported by facts. This article will provide an unambiguous view on gun violence in the USA, whilst also seeking to reveal the truth behind the movie and about the persuasive power of documentary. The film positions the audience through the use of convincing techniques to accept the truth set forth in the film, although these issues are very real in the United States. Bowling For Columbine explores various exaggerated representations of the American populous, whilst also bestowing on the audience that there are problems with guns and their second amendment. The filmmaker is superficial with his questions that are pointed, the use of witty, dry and mocking remarks are used in his favour to lighten the fact that it s a movie about people shooting others. From the word go, Moore sets off on his routine prejudice pathway. This included Moore opening a new account at North Country Bank that offers him a gun, whilst asking sarcastic questions like Do you think it s a little dangerous handing out guns at a bank? and not letting the workers give a response. Moore uses such techniques to mock the staff, which helps him achieve his purpose, inadequately proving the truth over the evidence. In addition to Moore s scornful interrogations, he uses music to portray a certain light in the film. A truly touching and upsetting section of the documentary is the montage with What a Wonderful World played over the top flashes are shown of America s decisions in the past relating to war and foreign involvement. The use of Louis Armstrong s song is ridiculing the American government, which makes a suggestion how it isn t a wonderful world , in fact the opposite. The flashing by of the clips of people dying, being shot and interracial foreigners carrying American built guns aesthetically gives the feeling of a mismanaged government. Moore does this to turn his audience from the political leaders to his personal views through making the audience distraught. Furthermore throughout the film it continues on making the audience feel further troubled. This is experienced in the scene when the 911 calls overlay the slow motion video footage walking through the corridors of Columbine High School. Which is intended to position the audience as a first hand student and gives a distressed feeling due to the audience feeling remorse for these dying students. Then it cuts to video footage from the cafeteria on April 20, 1999, watching shots fired, students hiding scared under the lunch tables, bombs exploding, fires starting and students running to get out. This major scene, influences the audience to feel upset and the tone gives a scared feeling, which reinforces the issue of gun control. Following the scene vividly re-living the Columbine High shootings, Moore switches to the then NRA president, Charlton Heston, as he screams his famous five word line From my cold dead hands (Heston, 2002) and waves a gun above his head to a roaring crowd. A voiceover is played which explains how just ten days after the massacre, the NRA held a pro-gun rally in Denver despite the pleas of the community in mourning. Moore pushes this negative representation of the NRA when in fact Heston didn t scream his five words on this occasion; it actually was one year later in Charlotte (refer to picture underneath the column). Also the NRA meeting after the flint shootings occurred 8 months later. Moore uses his power to portray Heston as a villain, through using illusion of reality to converting the audience s perspective. Throughout the first hour, Moore uses an aggressive stance and reasons why America has a high rate of gun related violence, but to meet his intention he contests with a counterexample and provides a description. To start off he states the overwhelming number of guns must be the reason, and then subsequently he states that Canada has about the same ratio of guns, but only a third of the homicides. Moore also discusses Europe in comparison to America s violent history. During the closing scene of Columbine Moore is filmed going to Hollywood to interview and ridicule the NRA president Charlton Heston. At first when asking for the interview Moore appears as a keen and eager fan, then criticizes Heston. During the interview Heston repeatedly pauses and doesn t respond to the question asked, Moore uses these as an advantage to silence his opinion and asks relentless and rude remarks. Moore is a coward for taking advantage of Heston who was in the early stages of Alzheimer s Disease (A brain disease leading to a decrease in mental power). In the end, Heston quickly leaves after getting up and announcing that the interview is over. Moore uses this illusion of reality, portraying the negative NRA stance into the final scene making the audience reflect. After the final scene, Moore uses additional voiceovers providing additional bias. The documentary targets Americans and teenagers throughout the world, especially those who don t have a clear knowledge of why there is so much gun violence. Throughout the film there are bursts of music and loud grasping sounds that are used to grab the audiences attention. Moore uses a series of cleverly edited together loud and shocking clips, which are a combination of visual and auditory footage. This is designed to keep the attention and confuse the audience. Shocking and explosive newsflashes, strange circumstances, frustrated interviewees, sardonic twists, and animations all joined together into an hour and fifty-seven minutes, the movie is best described as a documentary for the new generation. Furthermore Moore uses analytical features and prejudicial techniques, which position the audience to accept his point of view over the NRA or Charlton Heston. Moore has been ridiculed for editing to su it his aim, which isn t following the documentary genre. Moore correspondingly marginalises to suit his aim. In the film, it is focused on an anti-gun stance, but there isn t time for pro gun enthusiasts to voice their opinion. Moore repetitively edits out responses to his questions so that it doesn t affect the state of mind he wants his viewers to feel, also mocking people through his voiceovers. Although his techniques are arguable, the issues of which he discusses are of significance. Gun loving is as American as having a pie on thanksgiving, although he shows more opinions antigun related, he doesn t make opinions up for his viewers, he used rhetorical questions which let the audience think about what they are watching. Columbine is a well-organised documentary, which helps raise issues in a political controlled society, and in the end it lets viewers think, which a lot of modern day movies lack. Do you believe that Moore depicts the American culture correctly? A group of people living and breathing in fear? Do you suppose that he is telling the truth? On first viewing the movie I believed Moore, did you?

Saturday, July 20, 2019

Effect of Globalization on IT Service Providers in Europe

Effect of Globalization on IT Service Providers in Europe Opportunities and challenges presented by Globalization: IT Service providers in Continental Europe EXECUTIVE SUMMARY Enterprises within Europe are increasingly trying to seek the advantages of global sourcing. Unlike enterprises in U.S. or U.K., continental European countries have historically been reluctant to engage with offshore providers. The reasons were far stretched, ranging from political sensitivity, labor laws, cultural compatibility and language requirements. Globalization, however, is creating new avenues that European companies can not ignore. A recent report by Gartner shows the potential IT Offshoring market to be in the range of about $ 200 to 240 Billion. The market is expected to register double digit growth for years to come. The current offshore spending by firms amounts to just $17 Billion worldwide. This clearly shows a big gap, a huge market potential which is yet to be exploited. The huge demand has also led to emergence and growth of several new players in the field of IT Outsourcing/ Offshoring services, this is leading to ever increasing competition in the marketplace. In order to cope up with this increased competition and to provide better services, these service providers are increasingly adopting Global delivery models. By selecting an advantageous and cost effective proportion of resources worldwide, Global Delivery Model boosts business performance while also lowering costs. It also helps the supplier deliver requirements that are met on-time, within budget, and with high quality; greater efficiency and responsiveness to their clients. In Europe, nearshore models still dominate the market. But these models are continuously being updated, with more and more providers setting up Offshore Development Centers in locations like India. A framework for building an optimal combination of onsite, nearshore, and offshore delivery capabilities is provided by Capgeminis Rightshore ® model. A recent Gartner report has suggested that, the current US economic slowdown is expected to lead buyers of IT services to consider increasing the percentage of their labor in offshore locations. India will remain the dominant location for IT offshore services for North American and European buyers as a result of its scale, quality of resources and strong presence of local and traditional service providers. INTRODUCTION: EUROPEAN IT MARKET The European market remains a highly complex and competitive market with a large number of providers. Mergers and acquisitions will continue but will be balanced by new market entrants Outsourcing adoption in Europe is increasing for both infrastructure and applications; the widespread lack of well defined sourcing strategies among buyers and the realities of ever-changing business requirements will generate frequent deal negotiations and renegotiations Global delivery and utility services are irreversible trends evolving at different speeds among various European countries. The European multi country, multi language/culture composition increases the evolutionary complexity of these trends Selective outsourcing with multiple providers will remain the preferred model of engagement for European buyers. Governance and end-to-end integration/management of different providers/solutions are the most challenging aspects of it ITO market maturity varies: UK is the most matured IT market in Europe. The other European markets are maturing at different speeds. An acceleration in ITO adoption is now apparent in countries such as France and Germany A focus on achieving service delivery excellence and the best value/quality balance is increasingly driving European organizations (especially those beyond the first generation deal) to consider selecting multiple providers for an outsourcing contract. For example, in the IT Telecom sector, the most common division is by service tower, with customers opting to choose different providers for their network, desktop, data center and application competencies. At the moment, however, providers tend to join forces in an opportunistic manner, as a response to customer demands. This is the cause behind the ever-changing composition of the providers teams; as a consequence, consolidating best practices to manage IT service spin offs between different providers in an effort to guarantee end-to-end service delivery excellence remains challenging. As the number of providers engaged is set to increase, this challenge is likely to intensify. It will also be driven by other market characteristics, which include a persistent tactical use of outsourcing by European customers, insufficient process maturity, and lack of clarity in the definition of roles and responsibilities. As we look at global delivery, it is fair to say that there are two major misconceptions that still exist among the European market: 1) Global delivery is often considered as a synonym of offshore, and 2) IT services delivered through global delivery capabilities are application services. In reality, in the past few years, the European market has witnessed a considerable expansion in terms of both geographical location options (in areas such as Eastern Europe or North Africa, for example) and portfolio of services offered (now including, for example, help desk and remote infrastructure management services). Global delivery and offshore, however, remain the key deal characteristics that need to be treated with extra care in many European geographies, and as a consequence, many deals remain confidential. Traditional providers investment will be directed toward enhancing existing capabilities (especially near shore in Eastern Europe) and ensuring process solidity. Offshore providers inv estment on the other side will be centered on creating front-end capabilities with a focus on specific country and vertical-oriented competencies. While these global delivery models mature and are refined/ optimized, customers satisfaction will remain a challenge. KEY TRENDS SHAPING IT OUTSOURCING MARKET IN EUROPE TRENDS CHARACTERISTICS Selective Outsourcing With Multiple Providers * Embraced by majority of European companies * Objectives: IT excellence and cost optimization * Integration and governance challenges Global Sourcing and Global Delivery Models * Near shore proximity key for European market * Expanding portfolio of outsourcing services * Key area of investment for providers and buyers IT Utility * Industrialization is accelerating * Convergence of IT utility and global delivery * Key drivers: flexibility, efficiency, optimized cost, speed Aggressive ESP Competitive Landscape * National, global and offshore ESPs converging * Mergers, acquisitions and divestitures to continue * Providers are implementing new business models * New offshore market entrants Application Outsourcing to Grow * Drivers: portfolio rationalization, legacy modernization * Global delivery will gain acceptance * Multitude of providers competing Source: Gartner The U.K., Netherlands, Sweden and Finland are examples of countries more attracted by the global delivery model. However, in the meantime, the impact of global competition has started to drive countries such as Germany and France to consider global delivery as a viable option to be considered strategically, rather than when all other options have been exhausted. Despite a slower gestation and the fact that a complete infrastructure utility (IU) offering has not yet been developed, the IU model is continuing to attract new offerings and/or new providers. In the meantime, European customers, attracted by the idea of being able to access IT services in a flexible way, remain cautious as they expect further clarity on issues such as unit definition, pricing mechanisms, integration to existing systems, and security portability In the near future, we expect that the IU for ERP platforms will remain the most common battleground for providers; other providers are expected to instead mask their IU offering behind a package that includes product and support services. The concept of software as a service (SaaS) or ready-to-use applications will continue to generate lot of interest. Expectations for a solid delivery and specific functionalities will drive providers to specialize their offerings. Finally, gains in terms of process efficiency will be seen as crucial to deliver enhanced competitiveness, flexibility, agility and cost optimization. GLOBAL TRENDS: IT OUTSOURCING and OFFSHORING MARKET IT Outsourcing market is showing an average growth of 9% p.a. IT Outsourcing Worldwide forecast (Million $) Source : Gartner Dataquest In terms of volume, North America continues to be the leader in IT outsourcing. Latin America and APAC have shown good growth Europe has fast emerged as a big IT outsourcer Global offshore spending is continuing to register double digit growth. Worldwide Offshore IT Services Spending by Importing Region (million $) Source: Gartner Dataquest, 2004 and Worldwide and U.S. Offshore IT Services 2006-2010 Forecast In terms of volume, the North America continues to be the leader in IT offshoring. Once averse to the idea of outsourcing, Europe is now steadily adopting an IT offshore model to boost the economy Global offshore spending is projected to increase to 29400 $ Million in 2010 The graph on the next page shows the potential market for various types of sourcing options. This clearly depicts that he IT and Business Process offshoring market has grown at a tremendous rate over the past 7 year and the market provides a huge potential which is yet to be exploited. IT and BPO market Source Gartner, Dataquest, Aberdeen Group, McKinsey, Evalueserve, Infosys, IDC and Nasscom strategic review 2008 Currently we are not even exploiting 10% of the potential market size ( IT services off shoring just at $17 Billion, whereas market potential is about $200-240 Billion *) According to a new research by Gartner, the market is likely to grow further after the financial slowdown, as firms will try aggressively to reduce costs and improve efficiency Different Sourcing Models In-sourcing / Shared Services: Sourcing from internal sources or from an affiliated firm in the home economy Onshore Outsourcing: Sourcing from a non-affiliated firm in the home economy Captive Offshoring: Sourcing from an affiliated firm located abroad Offshore Outsourcing: Sourcing from a non-affiliated firm located abroad REGIONAL DYNAMICS ACROSS EUROPE The following section will describe the regional ITO trends and local dynamics across different European locations. UK and IRELAND 2005: â‚ ¬17.2B 2010: â‚ ¬25.7B 2005-2010 CAGR: 8.3% ITO drivers: Improve IT quality for end users, speed/flexibility, access to technical skills, cost reduction Inhibitors: Loss of control, lack of trust, security/privacy, IP Key trends: †¢ Most mature market in Europe with wider number of mega deals (public sector) †¢ Deal sophistication, including government. Increasing interest in new pricing schemes, business enhancement and shared services †¢ More selective sourcing and global delivery †¢ Areas such as Scotland and Ireland feeling pressure of Indian and Eastern European operations †¢ Wide potential for application engagements to mature from project engagements into outsourcing based engagements Despite being the largest and most mature market in Europe, the U.K. remains also one of the fast-growing ones. Here organizations seem to have moved away from the equation of outsourcing = cost reduction. While cost remains a key component, other objectives seem more important, such as improving IT service delivery, gaining specific skills, especially for application outsourcing deals, and becoming a more flexible organization. (See Appendix F) Inhibitions remain related to a general lack of trust in the ability to join forces with the providers to manage security, control over IT operations and IP. The U.K. market is characterized by a large number of mega deals, especially in the public sector. These outsourcing deals often include initiatives that have classically been carried out through project engagements and now are increasingly being performed in the initial phases of an IT outsourcing or BPO deal. This change reflects the growing desire of customers for a tighter link between investment and results (for which the outsourcer is responsible during the duration of the contract) and the important shift in role for the internal IT department. Rather than focusing on assembling and managing all of the necessary skills and capabilities to meet a certain objective, IT organizations, in this scenario, are responsible for coordinating the objectives of the Business Unit and the internal and external providers engaged to support them. Often infrastructure outsourcing is at the core of these complex relationships. At the same time, the U.K. is also the largest market in terms of adoption of IT services delivered through a network of global delivery capabilities (which include nearshore and offshore locations). From this point of view, areas that used to be considered as low cost for outsourcing operations (Scotland and Ireland) continue to feel the pressure of Indian and Eastern European capabilities. Finally, organizations that have engaged for a long period of time in project-based application deals are planning to elevate them into more-strategic, long-term application management engagements. This will allow them to gain a longer-term commitment from the service provider and the relevant support to re-evaluate their application portfolio. NORDIC COUNTRIES 2005: â‚ ¬5.2B 2010: â‚ ¬7.6B 2005-2010 CAGR: 8.2% Drivers: Cost reduction, access to technical skills (especially in application outsourcing engagements), support in global operations, focus on core business Inhibitors: Loss of control, security/privacy, lack of trust Key trends: †¢ Nordic market generally mature. Many large deals are in second or third generation. Some likely to evolve toward multi sourcing †¢ Large corporations see global delivery as a viable option. SMBs see nearshore option more favorably †¢ Consolidation drives specialization by geography, vertical market or horizontal service †¢ Increased competition between regional and global ESPs †¢ Cultural affinity seen as crucial to guarantee deal success/longevity Each of the four country markets that compose the Nordic region has its own distinct characteristics and buying behaviors in IT services. However, if we look at the forecast growth between 2005 and 2010, we expect the region to grow at a similar speed (despite size differences) of about 8%. Denmark: Sometimes seen as the entry point for the global service providers to the Nordics. Expected growth is from â‚ ¬856 million in 2005 to â‚ ¬1.2 billion in 2010 (CAGR of 7.8%). Finland: Unique in the Nordic region as buyers focus much more on business value of an outsourcing deal rather than just cost. Expected growth is from â‚ ¬1 billion in 2005 to â‚ ¬1.45 billion in 2010 (CAGR of 7.5%) Norway: Remains the smallest outsourcing market in the region. Expected growth is from â‚ ¬1.2 billion in 2005 to â‚ ¬1.8 billion in 2010 (CAGR of 8.1%) Sweden: Largest market and very cost-competitive. Probably the Nordic country targeted most by offshore providers currently. Expected growth is from â‚ ¬2 billion in 2005 to â‚ ¬3.1 billion in 2010 (CGR of 8.7%) From a client perspective, the Nordic region market is generally mature, with many large corporations in second- or third-generation outsourcing deals. Global delivery is widely accepted as an option. Competition between regional providers and global providers is increasing; this was initiated by the inability of local providers to support the operations of key Nordic organizations around the globe. However, recent acquisitions and divestitures by both local and international providers prove that the market has still got room for further maturation and consolidation. NETHERLANDS 2005: â‚ ¬3.4B 2010:â‚ ¬5B 2005-2010 CAGR: 8% Drivers: Cost reduction above all, agility/flexibility, improving service to end users Inhibitors: Loss of IP and control, security/privacy, high cost Key trends: Market shows mixed signs of maturity (organizations accept global delivery) and immaturity (sourcing strategy is often neglected) Market split between large global corporations and wide portion of SMBs Increased competition for local/national champions Application under scrutiny for externalization The market in the Netherlands is one of the more modern IT outsourcing environments in Europe, closely following the U.K. in many trends. A focus on global delivery and the expansion of many deals into the application or business process layer points to more market maturity. This maturity is driven primarily by the relatively high proportion of large (and often multinational) enterprises headquartered in the Netherlands and competing in major markets such as financial services. But there are some contradictory characteristics that point to an immature market (cost cutting is by far the major driver, and sourcing strategy is often neglected); this, as a consequence, often inhibits the potential success of outsourcing initiatives. The market remains very challenging and competitive. This is due to the high presence of small and midsize businesses (SMBs), which traditionally tend to consider outsourcing as a threat more than an opportunity and require a higher level of customization, which tests the profitability model of service providers. Competition remains strong for national champions as global and offshore providers continue to target opportunities in the country. Increasingly, application outsourcing opportunities are emerging as organizations look at portfolio rationalization, legacy system transformation, and custom application software development initiatives and accessing application utility solutions. FRANCE 2005: â‚ ¬6.6B 2010: â‚ ¬10B CAGR: 8.4 % Drivers: Cost reduction, refocus internal IT, speed/flexibility Inhibitors: Loss of control, lack of trust, security/privacy Key trends: Beyond its reliance on staff augmentation, Frances outsourcing market shows opportunities in all facets of outsourcing: infrastructure, applications and BPO Selective outsourcing has gained acceptance, and organizations show cautious interest in global service delivery National champions remain under competitive pressure from the global and multinational providers France has long been considered behind in the outsourcing trend. Now, however, the French outsourcing market is consolidating and growing, while the long-standing reliance on staff augmentation is losing strength. The major driver that will support a CAGR of over 8% between 2005 and 2010 is the need for French organizations to reduce cost and enhance their level of competitiveness in the market by refocusing their internal IT skills on more-strategic tasks while gaining flexibility. On the other side, it is interesting to see that challenges related to HR management have lost strength, compared with the traditional fears related to loss of control and security and lack of trust. Large organizations have recently moved toward the adoption of selective outsourcing with multiple providers. This model has gained acceptance as organizations look at maximizing the balance between cost and service delivery excellence. There is also a new focus on application outsourcing. This trend is important not only because it signals an acceleration in the growth of outsourcing in France overall, but because it signals a major change in the way French organizations use different kinds of IT services. Increase in application outsourcing deals also touches on one of the major taboos of IT services in France: offshore outsourcing. As such, although offshore remains a word to be used with extra care in the French market, many organizations would consider that access to global delivery models is an appealing part of outsourcing, especially when delivered by traditional players. In this case, North Africa (Morocco, for example) is emerging as a viable near shore location. National champions, the providers that focus on a specific region or country, remain under competitive pressure from the global and multinational providers. GERMANY 2005: â‚ ¬10.6B 2010: â‚ ¬16B 2005-2010 CAGR: 8.6% Drivers: Cost reduction above all, focus on core business, refocus internal IT Inhibitors: Security/privacy, lack of trust, loss of control Key trends: Global economic pressures have forced many organizations to look at outsourcing as a viable option In the short term, objectives such as flexibility and agility are secondary Pressure to divest internal IT departments or internal shared service organizations remains strong Global delivery gaining ground especially toward Eastern Europe Intensifying competition between strong German players and global ones Legacy system modernization will remain a key objective The German market is â€Å"federated† in several ways: government responsibilities, industrial centers, buying centers within enterprises, and management structures in place. All of this makes doing business in Germany (and negotiating significant IT service deals) unique. Decision processes tend to be longer, require more consensus building and often entail more travel than in other parts of Europe. For a long time, the majority of German organizations have considered IT operations as a key component to maintain or enhance their level of competitiveness in the market. This has, as a consequence, slowed the outsourcing growth. In the past two years, however, economic pressures have forced many organizations to look at outsourcing tactically to cut cost. While in the short term, achieving flexibility is a secondary objective, organizations look at outsourcing as a way to refocus their internal capabilities while focusing on their core business. The traditional inhibitors around security, trust and loss of control apply. While non-German external service providers (ESPs) still find it difficult to position themselves in Germany (exceptions are IBM Germany, which established itself early on as a â€Å"German† ESP, and HP, based on its early SAP hosting business and penetration as a technology provider), German providers maintain strong domestic positions and are starting to focus on expanding their international presence (through T-Systems). In the short term, German organizations will still consider selling their own IT capabilities, while global providers will see these as viable targets to build capabilities as long as they provide financial support through a long-term outsourcing deal. Finally, beyond potential healthy growth for ERP application outsourcing initiatives (especially SAP), as many organizations look at legacy system modernization, it is likely that many projects will evolve and deploy model to include the long-term management of applications. EASTERN EUROPE 2005: â‚ ¬1.1B 2010: â‚ ¬1.6B 2005-2010 CAGR: 7.9% Drivers: Acquisitions made by large Western European organizations, increased competition, need to revamp obsolete IT environments (leap-frog) Inhibitors: Low expertise to manage OS deals, high cost of OS, loss of control Key trends: Slow internal consumption of outsourcing Key nearshore delivery hub for providers supporting operations of European organizations Local Eastern European service providers will remain target for acquisitions Long-term growth will be supported by increasing competition, acquisitions made by Western companies and the penetration of Western ESPs in the region The region has become a strong global delivery hub Recent admission to the European Union has transformed countries such as Poland, Romania and the Czech Republic into attractive locations to establish global delivery capabilities designed to deliver IT services to European or global customers. Eastern Europe has been identified as an ideal region to establish a service delivery hub by U.S.-based providers (IBM, Accenture and EDS), European ones (Atos Origin, Capgemini, T-Systems, SIS and ST) and offshore ones (Ness, TCS, Satyam, Infosys and Wipro). When necessary, providers are openly seeking acquisitions to gain scale; it is the case for SIS, which acquired ELAS, HT Computers in Slovakia, and Ibis-Sys in Serbia (February 2005). Others, like Austrian-based ST, are pursuing a strategy of becoming the provider of choice in Eastern Europe through a combination of organic development and local acquisitions. ST acquired Computacenter Austria to strengthen its product resale capabilities. Although internal consumption of outsourcing has been slow, it is expected to grow rapidly, thanks to increasing competition driven by the fact that private-sector companies and public-sector organizations are now focusing on bringing their systems into line with market standards. This is leading to some â€Å"leapfrogging† effects — the IT utility approach, for example, holds significant appeal without posing the same transition challenges as elsewhere — but because these markets are fairly immature, there is still a strong focus on products and product support services rather than more-sophisticated IT service engagements. Italy and Spain are two other major countries with an expected ITO market size of about 5 Billion $ each by the year 2010. GLOBAL DELIVERY MODEL GDM is a unique approach to outsourcing and off shoring, which offers the best of both worlds by blending onsite, onshore and offshore resources and locations. By using a far-reaching network of onsite, onshore, and offshore resources, GDM aims to cuts across geographies to access the right resources, in the right place, at the right cost. By selecting the most advantageous and cost effective proportion of resources worldwide, Global Delivery Model boosts business performance while also lowering costs. It also helps the supplier deliver requirements that are met on-time, within budget, and with high quality; greater efficiency and responsiveness to their clients. In this section we would discuss in detail, the key drivers to a successful GDM. Source: Capgemini, 2008 KEY DRIVERS OF A SUCCESSFUL GDM STRONG PROCESSES Strong processes are the backbone of a successful Global Delivery Model. There is a strong need for detailed, documented and time-tested processes for all the activities and interfaces. Strong quality and project management processes ensure delivery excellence. World class processes for knowledge management and sharing resources encourage improved learning among teams. Processes for managing talent ensure that the projects get the best and most motivated people. Strong processes for interaction and communication within team make it possible for globally distributed groups to interface and collaborate in an effective manner while delivering excellence on a continuous basis. On the other hand, processes, while strong, should leave ample space for creativity and flexibility. It is only then that the Global Delivery Model (GDM) can create far more value than the traditional sourcing models. Here is what it will translate into: Quicker, seamless transitions, and early project ownership Optimum onsite/ offshore mixes through intelligent allocation of the available resources High degree of predictability through processes, sharing and reuse A strong relationship approach to ensure continuity and business focus Sharing of best practices and tools across the enterprise Depth and quality of resources, continuously trained and retrained to suit project needs Adherence to SLA based pricing models to ensure good Return on Investment (ROI) and drive customer satisfaction PROCESS ARCHITECTURE Companies rely on processes to consistently deliver high quality solutions while executing a number of engagements from multiple locations. According to the policies adopted by a leading IT services provider: values, vision and policies should form the first level of the three-tiered process architecture. These are then implemented through process execution at the next level. These processes are defined with clear ownership and clearly defined roles and responsibilities. Quality System Documentation Quality System Documentation defines clearly all the processes that should be put into place. These documents provide the engineers and consultations with a vast repository of detailed procedures, templates, standards, guidelines and checklists. The comprehensiveness of these documents supports all tasks from higher-level information abstraction and definition to tasks such as coding and documentation. This is crucial to assure clients with the delivery of high quality and predictable IT solutions that meet their business needs. These documents should also be monitored and updated regularly. Knowledge Sharing Employees are given a forum like a website portal, to share knowledge gained from their experience at the organization. It is meant to be a central repository of the knowledge that can be tapped by peers and as sometimes external clients as well. The collection of documents on this portal is reviewed and classified into different areas: Software development life-cycle activities such as requirements specification, design, build and testing documentation. Software-related topics such as tools and quality documentation. Topics of general or operational interest such as travel or HR policies, etc. Process Assets This is a repository to facilitate sharing and giving out of engagement learning across the organization. The user has the facility to submit to the repository, retrieve from the repository and obtain information on the status of the repository. A process asset can be any information ranging from an engagement, which can be re-used by future engagements. Typically these include project plans, configuration management plans, requirements documents, standards, checklists, design documents, test plans, causal analysis reports and utilities used in the engagement, etc. Process Database The Process Database is a software engineering database to study the processes at the organization with respect to productivity and quality. More specifically, its purpose areas are as follows: To aid estimation of effort and project defects To get the productivity and quality data on different types of projects To aid in creating of a process capability baseline Process Capability Baseline (PCB) Process Capability baseline is used to specify, what the performance of the process is, i.e. what a project can expect when following the process. This estimation is done based on the past data. The performance factors of the process are Effect of Globalization on IT Service Providers in Europe Effect of Globalization on IT Service Providers in Europe Opportunities and challenges presented by Globalization: IT Service providers in Continental Europe EXECUTIVE SUMMARY Enterprises within Europe are increasingly trying to seek the advantages of global sourcing. Unlike enterprises in U.S. or U.K., continental European countries have historically been reluctant to engage with offshore providers. The reasons were far stretched, ranging from political sensitivity, labor laws, cultural compatibility and language requirements. Globalization, however, is creating new avenues that European companies can not ignore. A recent report by Gartner shows the potential IT Offshoring market to be in the range of about $ 200 to 240 Billion. The market is expected to register double digit growth for years to come. The current offshore spending by firms amounts to just $17 Billion worldwide. This clearly shows a big gap, a huge market potential which is yet to be exploited. The huge demand has also led to emergence and growth of several new players in the field of IT Outsourcing/ Offshoring services, this is leading to ever increasing competition in the marketplace. In order to cope up with this increased competition and to provide better services, these service providers are increasingly adopting Global delivery models. By selecting an advantageous and cost effective proportion of resources worldwide, Global Delivery Model boosts business performance while also lowering costs. It also helps the supplier deliver requirements that are met on-time, within budget, and with high quality; greater efficiency and responsiveness to their clients. In Europe, nearshore models still dominate the market. But these models are continuously being updated, with more and more providers setting up Offshore Development Centers in locations like India. A framework for building an optimal combination of onsite, nearshore, and offshore delivery capabilities is provided by Capgeminis Rightshore ® model. A recent Gartner report has suggested that, the current US economic slowdown is expected to lead buyers of IT services to consider increasing the percentage of their labor in offshore locations. India will remain the dominant location for IT offshore services for North American and European buyers as a result of its scale, quality of resources and strong presence of local and traditional service providers. INTRODUCTION: EUROPEAN IT MARKET The European market remains a highly complex and competitive market with a large number of providers. Mergers and acquisitions will continue but will be balanced by new market entrants Outsourcing adoption in Europe is increasing for both infrastructure and applications; the widespread lack of well defined sourcing strategies among buyers and the realities of ever-changing business requirements will generate frequent deal negotiations and renegotiations Global delivery and utility services are irreversible trends evolving at different speeds among various European countries. The European multi country, multi language/culture composition increases the evolutionary complexity of these trends Selective outsourcing with multiple providers will remain the preferred model of engagement for European buyers. Governance and end-to-end integration/management of different providers/solutions are the most challenging aspects of it ITO market maturity varies: UK is the most matured IT market in Europe. The other European markets are maturing at different speeds. An acceleration in ITO adoption is now apparent in countries such as France and Germany A focus on achieving service delivery excellence and the best value/quality balance is increasingly driving European organizations (especially those beyond the first generation deal) to consider selecting multiple providers for an outsourcing contract. For example, in the IT Telecom sector, the most common division is by service tower, with customers opting to choose different providers for their network, desktop, data center and application competencies. At the moment, however, providers tend to join forces in an opportunistic manner, as a response to customer demands. This is the cause behind the ever-changing composition of the providers teams; as a consequence, consolidating best practices to manage IT service spin offs between different providers in an effort to guarantee end-to-end service delivery excellence remains challenging. As the number of providers engaged is set to increase, this challenge is likely to intensify. It will also be driven by other market characteristics, which include a persistent tactical use of outsourcing by European customers, insufficient process maturity, and lack of clarity in the definition of roles and responsibilities. As we look at global delivery, it is fair to say that there are two major misconceptions that still exist among the European market: 1) Global delivery is often considered as a synonym of offshore, and 2) IT services delivered through global delivery capabilities are application services. In reality, in the past few years, the European market has witnessed a considerable expansion in terms of both geographical location options (in areas such as Eastern Europe or North Africa, for example) and portfolio of services offered (now including, for example, help desk and remote infrastructure management services). Global delivery and offshore, however, remain the key deal characteristics that need to be treated with extra care in many European geographies, and as a consequence, many deals remain confidential. Traditional providers investment will be directed toward enhancing existing capabilities (especially near shore in Eastern Europe) and ensuring process solidity. Offshore providers inv estment on the other side will be centered on creating front-end capabilities with a focus on specific country and vertical-oriented competencies. While these global delivery models mature and are refined/ optimized, customers satisfaction will remain a challenge. KEY TRENDS SHAPING IT OUTSOURCING MARKET IN EUROPE TRENDS CHARACTERISTICS Selective Outsourcing With Multiple Providers * Embraced by majority of European companies * Objectives: IT excellence and cost optimization * Integration and governance challenges Global Sourcing and Global Delivery Models * Near shore proximity key for European market * Expanding portfolio of outsourcing services * Key area of investment for providers and buyers IT Utility * Industrialization is accelerating * Convergence of IT utility and global delivery * Key drivers: flexibility, efficiency, optimized cost, speed Aggressive ESP Competitive Landscape * National, global and offshore ESPs converging * Mergers, acquisitions and divestitures to continue * Providers are implementing new business models * New offshore market entrants Application Outsourcing to Grow * Drivers: portfolio rationalization, legacy modernization * Global delivery will gain acceptance * Multitude of providers competing Source: Gartner The U.K., Netherlands, Sweden and Finland are examples of countries more attracted by the global delivery model. However, in the meantime, the impact of global competition has started to drive countries such as Germany and France to consider global delivery as a viable option to be considered strategically, rather than when all other options have been exhausted. Despite a slower gestation and the fact that a complete infrastructure utility (IU) offering has not yet been developed, the IU model is continuing to attract new offerings and/or new providers. In the meantime, European customers, attracted by the idea of being able to access IT services in a flexible way, remain cautious as they expect further clarity on issues such as unit definition, pricing mechanisms, integration to existing systems, and security portability In the near future, we expect that the IU for ERP platforms will remain the most common battleground for providers; other providers are expected to instead mask their IU offering behind a package that includes product and support services. The concept of software as a service (SaaS) or ready-to-use applications will continue to generate lot of interest. Expectations for a solid delivery and specific functionalities will drive providers to specialize their offerings. Finally, gains in terms of process efficiency will be seen as crucial to deliver enhanced competitiveness, flexibility, agility and cost optimization. GLOBAL TRENDS: IT OUTSOURCING and OFFSHORING MARKET IT Outsourcing market is showing an average growth of 9% p.a. IT Outsourcing Worldwide forecast (Million $) Source : Gartner Dataquest In terms of volume, North America continues to be the leader in IT outsourcing. Latin America and APAC have shown good growth Europe has fast emerged as a big IT outsourcer Global offshore spending is continuing to register double digit growth. Worldwide Offshore IT Services Spending by Importing Region (million $) Source: Gartner Dataquest, 2004 and Worldwide and U.S. Offshore IT Services 2006-2010 Forecast In terms of volume, the North America continues to be the leader in IT offshoring. Once averse to the idea of outsourcing, Europe is now steadily adopting an IT offshore model to boost the economy Global offshore spending is projected to increase to 29400 $ Million in 2010 The graph on the next page shows the potential market for various types of sourcing options. This clearly depicts that he IT and Business Process offshoring market has grown at a tremendous rate over the past 7 year and the market provides a huge potential which is yet to be exploited. IT and BPO market Source Gartner, Dataquest, Aberdeen Group, McKinsey, Evalueserve, Infosys, IDC and Nasscom strategic review 2008 Currently we are not even exploiting 10% of the potential market size ( IT services off shoring just at $17 Billion, whereas market potential is about $200-240 Billion *) According to a new research by Gartner, the market is likely to grow further after the financial slowdown, as firms will try aggressively to reduce costs and improve efficiency Different Sourcing Models In-sourcing / Shared Services: Sourcing from internal sources or from an affiliated firm in the home economy Onshore Outsourcing: Sourcing from a non-affiliated firm in the home economy Captive Offshoring: Sourcing from an affiliated firm located abroad Offshore Outsourcing: Sourcing from a non-affiliated firm located abroad REGIONAL DYNAMICS ACROSS EUROPE The following section will describe the regional ITO trends and local dynamics across different European locations. UK and IRELAND 2005: â‚ ¬17.2B 2010: â‚ ¬25.7B 2005-2010 CAGR: 8.3% ITO drivers: Improve IT quality for end users, speed/flexibility, access to technical skills, cost reduction Inhibitors: Loss of control, lack of trust, security/privacy, IP Key trends: †¢ Most mature market in Europe with wider number of mega deals (public sector) †¢ Deal sophistication, including government. Increasing interest in new pricing schemes, business enhancement and shared services †¢ More selective sourcing and global delivery †¢ Areas such as Scotland and Ireland feeling pressure of Indian and Eastern European operations †¢ Wide potential for application engagements to mature from project engagements into outsourcing based engagements Despite being the largest and most mature market in Europe, the U.K. remains also one of the fast-growing ones. Here organizations seem to have moved away from the equation of outsourcing = cost reduction. While cost remains a key component, other objectives seem more important, such as improving IT service delivery, gaining specific skills, especially for application outsourcing deals, and becoming a more flexible organization. (See Appendix F) Inhibitions remain related to a general lack of trust in the ability to join forces with the providers to manage security, control over IT operations and IP. The U.K. market is characterized by a large number of mega deals, especially in the public sector. These outsourcing deals often include initiatives that have classically been carried out through project engagements and now are increasingly being performed in the initial phases of an IT outsourcing or BPO deal. This change reflects the growing desire of customers for a tighter link between investment and results (for which the outsourcer is responsible during the duration of the contract) and the important shift in role for the internal IT department. Rather than focusing on assembling and managing all of the necessary skills and capabilities to meet a certain objective, IT organizations, in this scenario, are responsible for coordinating the objectives of the Business Unit and the internal and external providers engaged to support them. Often infrastructure outsourcing is at the core of these complex relationships. At the same time, the U.K. is also the largest market in terms of adoption of IT services delivered through a network of global delivery capabilities (which include nearshore and offshore locations). From this point of view, areas that used to be considered as low cost for outsourcing operations (Scotland and Ireland) continue to feel the pressure of Indian and Eastern European capabilities. Finally, organizations that have engaged for a long period of time in project-based application deals are planning to elevate them into more-strategic, long-term application management engagements. This will allow them to gain a longer-term commitment from the service provider and the relevant support to re-evaluate their application portfolio. NORDIC COUNTRIES 2005: â‚ ¬5.2B 2010: â‚ ¬7.6B 2005-2010 CAGR: 8.2% Drivers: Cost reduction, access to technical skills (especially in application outsourcing engagements), support in global operations, focus on core business Inhibitors: Loss of control, security/privacy, lack of trust Key trends: †¢ Nordic market generally mature. Many large deals are in second or third generation. Some likely to evolve toward multi sourcing †¢ Large corporations see global delivery as a viable option. SMBs see nearshore option more favorably †¢ Consolidation drives specialization by geography, vertical market or horizontal service †¢ Increased competition between regional and global ESPs †¢ Cultural affinity seen as crucial to guarantee deal success/longevity Each of the four country markets that compose the Nordic region has its own distinct characteristics and buying behaviors in IT services. However, if we look at the forecast growth between 2005 and 2010, we expect the region to grow at a similar speed (despite size differences) of about 8%. Denmark: Sometimes seen as the entry point for the global service providers to the Nordics. Expected growth is from â‚ ¬856 million in 2005 to â‚ ¬1.2 billion in 2010 (CAGR of 7.8%). Finland: Unique in the Nordic region as buyers focus much more on business value of an outsourcing deal rather than just cost. Expected growth is from â‚ ¬1 billion in 2005 to â‚ ¬1.45 billion in 2010 (CAGR of 7.5%) Norway: Remains the smallest outsourcing market in the region. Expected growth is from â‚ ¬1.2 billion in 2005 to â‚ ¬1.8 billion in 2010 (CAGR of 8.1%) Sweden: Largest market and very cost-competitive. Probably the Nordic country targeted most by offshore providers currently. Expected growth is from â‚ ¬2 billion in 2005 to â‚ ¬3.1 billion in 2010 (CGR of 8.7%) From a client perspective, the Nordic region market is generally mature, with many large corporations in second- or third-generation outsourcing deals. Global delivery is widely accepted as an option. Competition between regional providers and global providers is increasing; this was initiated by the inability of local providers to support the operations of key Nordic organizations around the globe. However, recent acquisitions and divestitures by both local and international providers prove that the market has still got room for further maturation and consolidation. NETHERLANDS 2005: â‚ ¬3.4B 2010:â‚ ¬5B 2005-2010 CAGR: 8% Drivers: Cost reduction above all, agility/flexibility, improving service to end users Inhibitors: Loss of IP and control, security/privacy, high cost Key trends: Market shows mixed signs of maturity (organizations accept global delivery) and immaturity (sourcing strategy is often neglected) Market split between large global corporations and wide portion of SMBs Increased competition for local/national champions Application under scrutiny for externalization The market in the Netherlands is one of the more modern IT outsourcing environments in Europe, closely following the U.K. in many trends. A focus on global delivery and the expansion of many deals into the application or business process layer points to more market maturity. This maturity is driven primarily by the relatively high proportion of large (and often multinational) enterprises headquartered in the Netherlands and competing in major markets such as financial services. But there are some contradictory characteristics that point to an immature market (cost cutting is by far the major driver, and sourcing strategy is often neglected); this, as a consequence, often inhibits the potential success of outsourcing initiatives. The market remains very challenging and competitive. This is due to the high presence of small and midsize businesses (SMBs), which traditionally tend to consider outsourcing as a threat more than an opportunity and require a higher level of customization, which tests the profitability model of service providers. Competition remains strong for national champions as global and offshore providers continue to target opportunities in the country. Increasingly, application outsourcing opportunities are emerging as organizations look at portfolio rationalization, legacy system transformation, and custom application software development initiatives and accessing application utility solutions. FRANCE 2005: â‚ ¬6.6B 2010: â‚ ¬10B CAGR: 8.4 % Drivers: Cost reduction, refocus internal IT, speed/flexibility Inhibitors: Loss of control, lack of trust, security/privacy Key trends: Beyond its reliance on staff augmentation, Frances outsourcing market shows opportunities in all facets of outsourcing: infrastructure, applications and BPO Selective outsourcing has gained acceptance, and organizations show cautious interest in global service delivery National champions remain under competitive pressure from the global and multinational providers France has long been considered behind in the outsourcing trend. Now, however, the French outsourcing market is consolidating and growing, while the long-standing reliance on staff augmentation is losing strength. The major driver that will support a CAGR of over 8% between 2005 and 2010 is the need for French organizations to reduce cost and enhance their level of competitiveness in the market by refocusing their internal IT skills on more-strategic tasks while gaining flexibility. On the other side, it is interesting to see that challenges related to HR management have lost strength, compared with the traditional fears related to loss of control and security and lack of trust. Large organizations have recently moved toward the adoption of selective outsourcing with multiple providers. This model has gained acceptance as organizations look at maximizing the balance between cost and service delivery excellence. There is also a new focus on application outsourcing. This trend is important not only because it signals an acceleration in the growth of outsourcing in France overall, but because it signals a major change in the way French organizations use different kinds of IT services. Increase in application outsourcing deals also touches on one of the major taboos of IT services in France: offshore outsourcing. As such, although offshore remains a word to be used with extra care in the French market, many organizations would consider that access to global delivery models is an appealing part of outsourcing, especially when delivered by traditional players. In this case, North Africa (Morocco, for example) is emerging as a viable near shore location. National champions, the providers that focus on a specific region or country, remain under competitive pressure from the global and multinational providers. GERMANY 2005: â‚ ¬10.6B 2010: â‚ ¬16B 2005-2010 CAGR: 8.6% Drivers: Cost reduction above all, focus on core business, refocus internal IT Inhibitors: Security/privacy, lack of trust, loss of control Key trends: Global economic pressures have forced many organizations to look at outsourcing as a viable option In the short term, objectives such as flexibility and agility are secondary Pressure to divest internal IT departments or internal shared service organizations remains strong Global delivery gaining ground especially toward Eastern Europe Intensifying competition between strong German players and global ones Legacy system modernization will remain a key objective The German market is â€Å"federated† in several ways: government responsibilities, industrial centers, buying centers within enterprises, and management structures in place. All of this makes doing business in Germany (and negotiating significant IT service deals) unique. Decision processes tend to be longer, require more consensus building and often entail more travel than in other parts of Europe. For a long time, the majority of German organizations have considered IT operations as a key component to maintain or enhance their level of competitiveness in the market. This has, as a consequence, slowed the outsourcing growth. In the past two years, however, economic pressures have forced many organizations to look at outsourcing tactically to cut cost. While in the short term, achieving flexibility is a secondary objective, organizations look at outsourcing as a way to refocus their internal capabilities while focusing on their core business. The traditional inhibitors around security, trust and loss of control apply. While non-German external service providers (ESPs) still find it difficult to position themselves in Germany (exceptions are IBM Germany, which established itself early on as a â€Å"German† ESP, and HP, based on its early SAP hosting business and penetration as a technology provider), German providers maintain strong domestic positions and are starting to focus on expanding their international presence (through T-Systems). In the short term, German organizations will still consider selling their own IT capabilities, while global providers will see these as viable targets to build capabilities as long as they provide financial support through a long-term outsourcing deal. Finally, beyond potential healthy growth for ERP application outsourcing initiatives (especially SAP), as many organizations look at legacy system modernization, it is likely that many projects will evolve and deploy model to include the long-term management of applications. EASTERN EUROPE 2005: â‚ ¬1.1B 2010: â‚ ¬1.6B 2005-2010 CAGR: 7.9% Drivers: Acquisitions made by large Western European organizations, increased competition, need to revamp obsolete IT environments (leap-frog) Inhibitors: Low expertise to manage OS deals, high cost of OS, loss of control Key trends: Slow internal consumption of outsourcing Key nearshore delivery hub for providers supporting operations of European organizations Local Eastern European service providers will remain target for acquisitions Long-term growth will be supported by increasing competition, acquisitions made by Western companies and the penetration of Western ESPs in the region The region has become a strong global delivery hub Recent admission to the European Union has transformed countries such as Poland, Romania and the Czech Republic into attractive locations to establish global delivery capabilities designed to deliver IT services to European or global customers. Eastern Europe has been identified as an ideal region to establish a service delivery hub by U.S.-based providers (IBM, Accenture and EDS), European ones (Atos Origin, Capgemini, T-Systems, SIS and ST) and offshore ones (Ness, TCS, Satyam, Infosys and Wipro). When necessary, providers are openly seeking acquisitions to gain scale; it is the case for SIS, which acquired ELAS, HT Computers in Slovakia, and Ibis-Sys in Serbia (February 2005). Others, like Austrian-based ST, are pursuing a strategy of becoming the provider of choice in Eastern Europe through a combination of organic development and local acquisitions. ST acquired Computacenter Austria to strengthen its product resale capabilities. Although internal consumption of outsourcing has been slow, it is expected to grow rapidly, thanks to increasing competition driven by the fact that private-sector companies and public-sector organizations are now focusing on bringing their systems into line with market standards. This is leading to some â€Å"leapfrogging† effects — the IT utility approach, for example, holds significant appeal without posing the same transition challenges as elsewhere — but because these markets are fairly immature, there is still a strong focus on products and product support services rather than more-sophisticated IT service engagements. Italy and Spain are two other major countries with an expected ITO market size of about 5 Billion $ each by the year 2010. GLOBAL DELIVERY MODEL GDM is a unique approach to outsourcing and off shoring, which offers the best of both worlds by blending onsite, onshore and offshore resources and locations. By using a far-reaching network of onsite, onshore, and offshore resources, GDM aims to cuts across geographies to access the right resources, in the right place, at the right cost. By selecting the most advantageous and cost effective proportion of resources worldwide, Global Delivery Model boosts business performance while also lowering costs. It also helps the supplier deliver requirements that are met on-time, within budget, and with high quality; greater efficiency and responsiveness to their clients. In this section we would discuss in detail, the key drivers to a successful GDM. Source: Capgemini, 2008 KEY DRIVERS OF A SUCCESSFUL GDM STRONG PROCESSES Strong processes are the backbone of a successful Global Delivery Model. There is a strong need for detailed, documented and time-tested processes for all the activities and interfaces. Strong quality and project management processes ensure delivery excellence. World class processes for knowledge management and sharing resources encourage improved learning among teams. Processes for managing talent ensure that the projects get the best and most motivated people. Strong processes for interaction and communication within team make it possible for globally distributed groups to interface and collaborate in an effective manner while delivering excellence on a continuous basis. On the other hand, processes, while strong, should leave ample space for creativity and flexibility. It is only then that the Global Delivery Model (GDM) can create far more value than the traditional sourcing models. Here is what it will translate into: Quicker, seamless transitions, and early project ownership Optimum onsite/ offshore mixes through intelligent allocation of the available resources High degree of predictability through processes, sharing and reuse A strong relationship approach to ensure continuity and business focus Sharing of best practices and tools across the enterprise Depth and quality of resources, continuously trained and retrained to suit project needs Adherence to SLA based pricing models to ensure good Return on Investment (ROI) and drive customer satisfaction PROCESS ARCHITECTURE Companies rely on processes to consistently deliver high quality solutions while executing a number of engagements from multiple locations. According to the policies adopted by a leading IT services provider: values, vision and policies should form the first level of the three-tiered process architecture. These are then implemented through process execution at the next level. These processes are defined with clear ownership and clearly defined roles and responsibilities. Quality System Documentation Quality System Documentation defines clearly all the processes that should be put into place. These documents provide the engineers and consultations with a vast repository of detailed procedures, templates, standards, guidelines and checklists. The comprehensiveness of these documents supports all tasks from higher-level information abstraction and definition to tasks such as coding and documentation. This is crucial to assure clients with the delivery of high quality and predictable IT solutions that meet their business needs. These documents should also be monitored and updated regularly. Knowledge Sharing Employees are given a forum like a website portal, to share knowledge gained from their experience at the organization. It is meant to be a central repository of the knowledge that can be tapped by peers and as sometimes external clients as well. The collection of documents on this portal is reviewed and classified into different areas: Software development life-cycle activities such as requirements specification, design, build and testing documentation. Software-related topics such as tools and quality documentation. Topics of general or operational interest such as travel or HR policies, etc. Process Assets This is a repository to facilitate sharing and giving out of engagement learning across the organization. The user has the facility to submit to the repository, retrieve from the repository and obtain information on the status of the repository. A process asset can be any information ranging from an engagement, which can be re-used by future engagements. Typically these include project plans, configuration management plans, requirements documents, standards, checklists, design documents, test plans, causal analysis reports and utilities used in the engagement, etc. Process Database The Process Database is a software engineering database to study the processes at the organization with respect to productivity and quality. More specifically, its purpose areas are as follows: To aid estimation of effort and project defects To get the productivity and quality data on different types of projects To aid in creating of a process capability baseline Process Capability Baseline (PCB) Process Capability baseline is used to specify, what the performance of the process is, i.e. what a project can expect when following the process. This estimation is done based on the past data. The performance factors of the process are