
Explain What You Understand By UTM Coordinates And How They Are Derived

The Universal Transverse Mercator (UTM) is a projection that is accurate within narrow zones. A map projection is any mathematical transformation of the globe onto another surface, including many that cannot be physically realized by an actual optical projection system. One of the most widely used and familiar projections, the Mercator, is a cylindrical projection: the world is encircled by an imaginary cylinder touching the earth at the equator, and the earth's surface is projected onto the cylinder. This differs from the cylindrical gnomonic projection, which is what would result from placing a light at the center of the earth and projecting the surface onto the cylinder; the gnomonic version produces very extreme distortions, especially in polar regions, and therefore has virtually no practical use. Because the Transverse Mercator is accurate within narrow zones, it has become the common reference for a global coordinate system known as the UTM system (Universal Transverse Mercator system), which was developed to provide a universal world system for mapping.

The world is subdivided into narrow longitude zones, each projected onto a Transverse Mercator projection. A grid is constructed on the projection and used to locate points. The advantage of the grid system is that, since the grid is rectangular and decimal, it is far easier to use than latitude and longitude. The disadvantage is that, unlike latitude and longitude, grid locations cannot be determined independently. The Transverse Mercator projection is applied in sixty positions to create sixty zones around the world, each six degrees of longitude in width. Positions on the earth are expressed as eastings and northings, measured in meters rather than in the degrees and minutes used for latitude and longitude. Eastings begin at 500,000 m at the central meridian of each zone, while northings begin at 0 at the equator and increase moving poleward (in the northern hemisphere; south of the equator a false northing of 10,000,000 m is used).

The UTM system divides the earth into 60 zones, each 6° of longitude wide. These zones define the reference points for a UTM grid coordinate within a particular zone. UTM zones extend from latitude 80°S to 84°N; within the polar regions, the Universal Polar Stereographic grid system is used instead. UTM zones are numbered from 1 to 60 starting at the International Date Line (longitude 180°) and proceeding east. Each zone is further divided into horizontal bands of 8° of latitude. These bands are assigned letters from south to north beginning with C (omitting I and O to avoid confusion with the numbers 1 and 0) and ending with X; band X is the one exception, spanning 12° of latitude. When using UTM coordinates, the band letter is included in the description along with the zone number.
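The zone and band arithmetic described above can be sketched in a few lines of Python. This is an illustrative sketch, not a standard library routine; the function and constant names are invented for this example.

```python
# Sketch: deriving the UTM zone number and latitude band letter from a
# geographic coordinate, following the layout described above: 60 zones
# of 6 degrees starting at 180 W, and 8-degree bands lettered C..X
# (I and O skipped; band X spans 12 degrees, absorbing 72 N..84 N).

def utm_zone(lon_deg: float) -> int:
    """Zone 1 covers 180W..174W; zone numbers increase eastward."""
    return int((lon_deg + 180) // 6) + 1

BAND_LETTERS = "CDEFGHJKLMNPQRSTUVWX"  # I and O omitted

def utm_band(lat_deg: float) -> str:
    """Bands cover 80S..84N in 8-degree steps; X takes everything above 72N."""
    if not -80 <= lat_deg <= 84:
        raise ValueError("latitude outside UTM coverage")
    index = min(int((lat_deg + 80) // 8), len(BAND_LETTERS) - 1)
    return BAND_LETTERS[index]

# Example: Nairobi (approx. 1.29 S, 36.82 E)
print(utm_zone(36.82), utm_band(-1.29))  # -> 37 M
```

The full conversion from latitude/longitude to easting/northing also requires the Transverse Mercator projection equations, which are omitted here.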

To determine one's location on the globe, one must know the hemisphere and zone one is in, because without the zone number and band letter, coordinates are identical from zone to zone. An example of a valid location is 15 S 0342911E / 4302262N, bearing in mind that coordinates are given in terms of eastings and northings:

15 represents the UTM zone.

S represents the UTM latitude band.

0342911E represents the easting in meters, not latitude: it is the east-west position within the zone.

4302262N represents the northing in meters, not longitude: it is the north-south distance from the equator.


Applied Managerial Finance

Working capital is an accounting measure used to gauge the financial efficiency, liquidity, and overall operation of a company: in other words, how efficiently the company operates over a short-term time frame. Working capital is useful for assessing a company's operations because it encompasses accounts payable, accounts receivable, inventory, cash, and salaries payable. Working capital is the difference between current assets and current liabilities (McLeary, 2000).

1. Explain what working capital is and provide an equation that can be used to compute it.

The working capital concept plays an important role in a company's financial management, since working capital is required for the company's operations. For example, a merchant's goods yield her no profit until she exchanges them for cash, and that cash yields profit only when it is in turn exchanged for goods. Working capital is needed regularly for business operations, which include buying raw materials, production activities, investment in stocks and stores, payment of both direct and indirect expenditures, and credit extended to customers who carry a balance (Shweta, 2013).

In financial management, a working capital statement showing the flow of funds is prepared. Its purpose is to illustrate the net effect of all funds flowing into and out of the company over a period of time. The accounting equation can thus be used to show the effect of funds flow on working capital:

Liabilities + Capital = Assets, or

Current Assets – Current Liabilities = Working Capital

That is, L + C = A

The equation can further be expanded to: Current Assets + Long Term Assets = Current Liabilities + Long Term Liabilities + Capital

That is, CA + LTA = CL + LTL + C

The same equation can be rearranged as CA – CL = LTL + C – LTA

In addition, any change in working capital equals the sum of the changes on the other side of the equation used to compute it. That is, ∆WC (change in working capital) = ∆LTL (change in long-term liabilities) + ∆C (change in capital) – ∆LTA (change in long-term assets).
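As a minimal sketch, the identities above can be expressed directly in Python. The balance-sheet figures below are illustrative placeholders, not Apex's actual numbers.

```python
# Sketch of the working-capital identities above, with invented figures.

def working_capital(current_assets: float, current_liabilities: float) -> float:
    """WC = CA - CL"""
    return current_assets - current_liabilities

def working_capital_long_view(ltl: float, capital: float, lta: float) -> float:
    """Equivalent long-term view: WC = LTL + C - LTA"""
    return ltl + capital - lta

def change_in_wc(wc_prior: float, wc_current: float) -> float:
    """Delta-WC across two periods."""
    return wc_current - wc_prior

wc_2011 = working_capital(20_000, 12_000)  # placeholder balance-sheet figures
wc_2012 = working_capital(28_000, 15_000)
print(change_in_wc(wc_2011, wc_2012))  # -> 5000
```

The change-in-WC figure is the same whether computed from the short-term side (CA – CL) or the long-term side (LTL + C – LTA), since the two are algebraically equal.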

2. Compute the working capital for Apex

In computing the working capital of a given company, say Apex, one looks at the balance sheet, since all the components of the equation above appear there (Aswath, 2007).

3. Describe the trend and show whether the trend is improving, deteriorating, or moderating.

From the balance sheet, it is evident that, among the current assets, inventory accounts for the largest amount and thus drives the gross working capital. In 2012 the inventory rose from $6,500 to $12,100. Working capital measures the average ability of Apex Company to meet its short-term obligations.

4. Demonstrate the need for external capital.

It can therefore be deduced from the balance sheet that the trend of the working capital is moderating, as the current assets and liabilities average out to roughly the same level from one year to the next.

References

Aswath, D. (2007, July). Return on Capital (ROC), Return on Invested Capital (ROIC) and Return on Equity (ROE): Measurement and Implications. Stern School of Business article.

McLeary, F. (2000). Accounting and its Business Environment. USA: Juta and Company Ltd.

http://books.google.co.ke/books?id=Xt2lLSi_qgIC&dq=explain+what+working+capital+is+and+provide+an+equation+that+can+be+used+to+compute+it.&source=gbs_navlinks_s

Shweta, M. (2013). Working Capital Trends and Liquidity Analysis of FMCG Sector in India. IOSR Journal of Business and Management (IOSR-JBM), 9(4), 45-52.

http://www.iosrjournals.org/iosr-jbm/papers/Vol9-issue4/F0944552.pdf

Quantitative Analysis

Hypothesis testing provides a basis for taking the ideas or theories that an individual initially forms about the economy or markets and then deciding whether those ideas are true or false. Quantitative analysis, in simple terms, is a scientific approach to management decision making. A quantitative hypothesis test proceeds in five steps:

State the null hypothesis and the alternate hypothesis.

Select the suitable test statistic and the level of significance.

State the decision rules.

Compute the test statistic and make the decision.

Interpret the decision.

When a difference is statistically significant, it means the difference is probably not due to chance. This does not say whether the difference is meaningful or minor. Effect sizes give one a quantitative way to judge the degree to which a significant difference may also be substantively important (Terrell, 2012).

The z-test and the t-test are essentially similar; both compare two means to assess whether the samples come from the same population. There are, however, variations on the t-test. If one has a single sample and wishes to compare it with a known mean (for example, a national average), the one-sample t-test is available. If the samples are not independent of each other and have some factor in common, for instance geographical site or a before/after treatment, the paired-sample t-test is appropriate. There are also two variations on the two-sample t-test: the first is used when the samples do not have equal variances, and the second when the variances are equal (Gravetter & Wallnau, 2000).
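As an illustration of the two-sample case with unequal variances (Welch's form), the t statistic can be computed with the Python standard library alone. The sample data are invented for the example.

```python
# Welch's two-sample t statistic: compares two sample means without
# assuming equal variances. Standard library only; data are made up.
from math import sqrt
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """t = (mean_a - mean_b) / sqrt(var_a/n_a + var_b/n_b)"""
    na, nb = len(sample_a), len(sample_b)
    return (mean(sample_a) - mean(sample_b)) / sqrt(
        variance(sample_a) / na + variance(sample_b) / nb
    )

a = [5.1, 4.9, 5.4, 5.0, 5.2]
b = [4.6, 4.8, 4.5, 4.9, 4.7]
print(round(welch_t(a, b), 3))  # -> 3.772
```

A full test would compare this statistic against a critical value from the t-distribution with the Welch-Satterthwaite degrees of freedom, which is omitted here for brevity.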

A one-tailed test is a statistical test in which the critical region lies in only one tail of a distribution, so that the test asks whether a value is either greater than or less than a certain value, but not both. If the sample being tested falls into the one-sided critical region, the alternative hypothesis is accepted instead of the null hypothesis. The one-tailed test takes its name from testing the area under one of the tails of the normal distribution, although the test can be used with other, non-normal distributions as well.

By contrast, a two-tailed test is a statistical test in which the critical region of a distribution is two-sided, testing whether a sample statistic is either greater than or less than a specified range of values. If the sample being tested falls into either critical region, the alternative hypothesis is accepted instead of the null hypothesis.

In conclusion, the labels "left-tailed test" and "right-tailed test" refer to the standard normal distribution (as well as all of the t-distributions). The key phrase for identifying a left-tailed test is "less than", and for a right-tailed test, "greater than" (Terrell, 2012).
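The difference between one-tailed and two-tailed critical regions can be illustrated with the standard normal distribution: at the 5% significance level, a two-tailed test splits the significance level across both tails, pushing the cutoff further out.

```python
# Critical values at the 5% level for one- and two-tailed z-tests,
# using the standard library's normal distribution.
from statistics import NormalDist

alpha = 0.05
z = NormalDist()  # standard normal

one_tailed_right = z.inv_cdf(1 - alpha)   # reject if statistic >  1.645
one_tailed_left = z.inv_cdf(alpha)        # reject if statistic < -1.645
two_tailed = z.inv_cdf(1 - alpha / 2)     # reject if |statistic| > 1.960

print(round(one_tailed_right, 3), round(two_tailed, 3))  # -> 1.645 1.96
```

This is why a result can be significant under a one-tailed test yet fall short under a two-tailed test at the same alpha.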

References

Gravetter, F. J., & Wallnau, L. B. (2000). Statistics for the behavioral sciences (5th ed., Instructor's ed.). Australia: Wadsworth/Thomson Learning.

Terrell, S. R. (2012). Statistics translated: a step-by-step guide to analyzing and interpreting data. New York: Guilford Press.

Explain What Problems in Meeting Social Science Reliability and Validity Tests a Researcher May Have who uses the AHP Procedures

In a world characterized by imperfection and chaos, the use of sound, valid and consistent measurement plays an integral role in restoring order in research. Employing ineffective methods makes it difficult for researchers to substantiate their arguments and arrive at meaningful conclusions, which undermines decision making and the adoption of timely interventions for resolving the innumerable problems at hand. In other words, ensuring that the measurements employed in the analysis of social problems are reliable and valid contributes significantly to precision in the implementation of decisions, a sustainable measure that plays an important role in addressing multifaceted social problems. This paper reviews the problems in meeting social science reliability and validity tests that a researcher who uses the Analytic Hierarchy Process (AHP) procedures may encounter.

According to Clapper, a simple analysis methodology is important in enhancing the credibility of results, because essential follow-up can be undertaken to review the process and make amendments accordingly. Although the AHP procedure is methodical, growth in the number of levels and pair-wise decisions makes the process very complex. This invites subjectivity that can compromise the credibility of the results. Howell notes that social science researchers often employ numerous comparisons during analysis, which usually culminates in a large number of comparison tables. The resulting complexity has far-reaching implications for the decision-making process, as it compromises consistency in decision making. Besides triggering subjectivity, Baker suggests that this complexity encourages the elimination of valid comparison attributes in a bid to make the calculations manageable. This limitation has led social scientists to devise software to aid in managing the respective data.

Further, Haken posits that the procedure is also vulnerable to human psychology. In particular, researchers employing this method have exhibited a tendency to expect improvement across successive builds, which leads them to subconsciously inflate the rankings of recent builds. In such cases, the final metrics on which decision making is based are not always reflective of the input data. Perhaps the inconsistency index has the greatest impact on the validity and reliability of the data. Lockwood and Lockwood indicate that the inconsistency index acts as a standardization measure: it needs to remain below the commonly accepted threshold value of 0.10. Values higher than this require modification of the crucial "attribute ranking vector", guided by the stipulated inconsistency index, and such modification adversely affects the ultimate values. Considering that the results are employed in critical decision making, this undermines the precision of the approaches that are adopted.
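The consistency check described above can be sketched as follows, assuming the standard AHP computation: priorities approximated from a column-normalized pairwise-comparison matrix, then the consistency index CI = (lambda_max - n)/(n - 1) and the consistency ratio CR = CI/RI against Saaty's random index. The 3x3 comparison matrix is invented for illustration.

```python
# Sketch of an AHP consistency check on a pairwise-comparison matrix.
# RI values are Saaty's standard random indices; the matrix is made up.

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

def ahp_consistency(matrix):
    n = len(matrix)
    # Column-normalize, then average each row to approximate priorities.
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    weights = [
        sum(matrix[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)
    ]
    # lambda_max: average ratio of (A w)_i to w_i.
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lambda_max = sum(aw[i] / weights[i] for i in range(n)) / n
    ci = (lambda_max - n) / (n - 1)
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] else 0.0
    return weights, cr

# Mildly inconsistent judgments among three criteria (reciprocal matrix).
m = [
    [1.0, 2.0, 4.0],
    [0.5, 1.0, 3.0],
    [0.25, 1 / 3, 1.0],
]
weights, cr = ahp_consistency(m)
print(cr < 0.10)  # acceptable when the consistency ratio is below 0.10
```

This makes the limitation discussed above concrete: when CR exceeds 0.10, the researcher must revise the pairwise judgments, and it is precisely this revision step that invites the biased adjustments Lockwood and Lockwood warn about.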

The PRS Group asserts that in order to maximize output, there is a dire need to ensure that the decision-making process is objective in nature; this is attained through the weighting process. In light of the limitation above, it is unlikely that researchers would base their decisions on credible information. The inherent modification, undertaken so that the values employed conform to certain standards, removes objectivity from the process and undermines the credibility of the overall results. Although the information is obtained from multiple sources, its evaluation and weighting are faulty and compromise the overall quality of the procedure. To curb this, social scientists need to be cautious, especially with respect to the analysis of data. For a process that requires the results to conform to certain values, this is an inherent problem that social science researchers using the AHP procedures continue to grapple with.

Bibliography

Baker, Pauline. "Conflict Resolution and Recovery Program". Accessed 9th September, 2010 at http://www.fundforpeace.org/programs/cpr/cpr.php

Clapper, James. "Review the Intelligence Threat Assessment Material". Accessed 9th September, 2010 at http://www.fas.org/irp/threat/general.htm

Haken, Nate. "The Fund for Peace". Accessed 9th September, 2010 at http://www.fundforpeace.org/web/index.php?option=com_content&task=view&id=324&Itemid=489

Howell, Llewellyn (ed.). The Handbook of Country and Political Risk Analysis, 3rd ed. East Syracuse, NY: The PRS Group, Inc., 2001.

Lockwood, Jonathan and Lockwood, Kathleen. The Lockwood Analytical Method for Prediction (LAMP). Washington DC: Joint Military Intelligence College, 1994.

The PRS Group. "Political Risk Service Methodology". Accessed 9th September, 2010 at http://www.prsgroup.com/PRS_Methodology.aspx

The Quality Portal. "Review the Analytic Hierarchy". Accessed 9th September, 2010 at http://thequalityportal.com/q_ahp.htm