after China's Great Leap Forward (1958–1960), the Ch'ŏllima Movement organized the labour force into work teams and brigades to compete at increasing production. The campaign was aimed at industrial and agricultural workers and at organizations in education, science, sanitation and health, and culture. In addition to work teams, units eligible for Ch'ŏllima citations included entire factories, factory workshops, and such self-contained units as a ship or a railroad station. The "socialist emulation" among the industrial sectors, enterprises, farms, and work teams under the Ch'ŏllima Movement frantically sought to complete the First Five-Year Plan (1957–1960) but instead created chaotic disruptions in the economy. This made it necessary to set aside 1959 as a "buffer year" to restore balance in the economy. Although the Ch'ŏllima Movement was replaced in the early 1960s by the Ch'ŏngsan-ni Method and the Taean Work System, the regime's reliance on mass campaigns continued into the early 1990s. Campaigns conducted after the Ch'ŏllima Movement included "speed battles" toward the end of a period (such as a month, a year, or an economic plan) to reach production targets and carry out the economic goals of the decade.

Efforts at modernization since 1991

Following the 1991 collapse of the Soviet Union, its principal source of external support, North Korea announced in December 1993 a three-year transitional economic policy placing primary emphasis on agriculture, light industry, and foreign trade. However, lack of fertilizer, natural disasters, and poor storage and transportation practices left the country more than a million tons per year short of grain self-sufficiency. Moreover, lack of foreign exchange to purchase spare parts and oil for electricity generation left many factories idle. The shortage of foreign exchange, caused by a chronic trade deficit, a large foreign debt, and dwindling foreign aid, has constrained economic development.
In addition, North Korea has been diverting scarce resources from developmental projects to defence; it spent more than 20% of GNP on defence toward the end of the 1980s, a proportion among the highest in the world. These negative factors, compounded by the declining efficiency of the central planning system and the failure to modernize the economy, slowed the pace of growth after the 1960s. The demise of the socialist states of the Soviet Union and Eastern Europe—North Korea's traditional trade partners and benefactors—compounded the economic difficulties of the early 1990s. The collapse of the Soviet Union and the end of Soviet support to North Korean industries caused North Korea's economy to contract by 25% during the 1990s. While, by some accounts, North Korea had a higher per capita income than South Korea in the 1970s, by 2006 its per capita income was estimated to be only $1,108, one seventeenth that of South Korea. Experimentation in small-scale entrepreneurship took place from 2009 to 2013, and although legal uncertainties persist, it has developed into a significant sector. By 2016 economic liberalisation had progressed to the extent that both locally responsible and state industrial enterprises gave the state 20% to 50% of their output, selling the remainder to buy raw materials at market-based prices, akin to a free market. In 2014, the Enterprise Act was amended to allow state-owned enterprise managers to engage in foreign trade and joint ventures, and to accept investment from non-government domestic sources. Under the new rules the enterprise director became more like a western chief executive officer, and the chief engineer took an operational role more like a western chief operating officer. As of 2017 it was unclear whether the Taean Work System (described above) still operated in practice to give local people's committees much influence. In 2017, Dr.
Mitsuhiro Mimura, Senior Research Fellow at Japan's Economic Research Institute for Northeast Asia, who has visited North Korea 45 times, described it as the "poorest advanced economy in the world": while it has a comparatively low GDP, it has built a sophisticated production environment. He described the recent rise of entrepreneurial groups through "socialist cooperation", whereby groups of individuals could start small enterprises as cooperative groups. Managers in state-owned industries or farms were also free to sell or trade production beyond state plan targets, providing incentives to increase production. Managers could also find investment for expansion of successful operations, in a process he called "socialist competition". A state plan was still the basis for production, but it was more realistic, leaving room for excess production.

Budget and finance

The state budget is a major government instrument in carrying out the country's economic goals. Expenditures represented about three-quarters of GNP in the mid-1980s, and their allocation reflected the priorities assigned to different economic sectors. Taxes were abolished in 1974 as "remnants of an antiquated society". This action, however, was not expected to have any significant effect on state revenue because the overwhelming proportion of government funds—an average of 98.1% during 1961–1970—came from turnover (sales) taxes, deductions from profits paid by state enterprises, and various user fees on machinery and equipment, irrigation facilities, television sets, and water. In order to provide a degree of local autonomy and to lessen the financial burden of the central government, a "local budget system" was introduced in 1973. Under this system, provincial authorities are responsible for the operating costs of institutions and enterprises not under direct central government control, such as schools, hospitals, shops, and local consumer goods production.
In return, they are expected to organize as many profitable ventures as possible and to turn over profits to the central government. Around December of every year, the state budget for the following calendar year is drafted, subject to revision around March. Typically, total revenue exceeds expenditure by a small margin, with the surplus carried over to the following year. The largest share of state expenditures goes to the "people's economy", which averaged 67.3% of total expenditures between 1987 and 1990, followed in magnitude by "socio-cultural", "defense", and "administration". Defense spending, as a share of total expenditures, increased significantly from the 1960s: from 3.7% in 1959 to 19% in 1960 and, after averaging 19.8% between 1961 and 1966, to 30.4% in 1967. After remaining around 30% until 1971, the defense share decreased abruptly to 17% in 1972, and continued to decline throughout the 1980s. Officially, in both 1989 and 1990 the defense share remained at 12%; for 1991 it was 12.3%, with 11.6% planned for 1992. The declining trend was consistent with the government's announced intentions to stimulate economic development and increase social benefits. However, Western experts have estimated that actual military expenditures are higher than budget figures indicate. In the 1999 budget, expenditures for the farming and power sectors were increased by 15% and 11%, respectively, compared with those of 1998. The 2007 budget estimated an increase in revenue to 433.2 billion won ($3.072 billion at $1 = 141 won). Revenue rose by 5.9% in 2006, and the figure for 2007 was put at 7.1%. North Korea claims to be the only state in the world that does not levy taxes; taxes were abolished beginning on April 1, 1974.

Bonds

Since 2003, the North Korean authorities have issued government bonds called "People's Life Bonds" and promoted the slogan "Buying bonds is patriotic".
North Korea sold bonds internationally in the late 1970s for 680 million Deutsche marks and 455 million Swiss francs. North Korea defaulted on these bonds by 1984, although the bonds remain traded internationally on speculation that the country would eventually perform on the obligations.

The latest trends

The Sydney Morning Herald reported that Kim's earlier propaganda themes had shifted toward patriotism and the economy, and toward improving relationships with China, South Korea, and the United States. State-run television promoted a song of praise to the national flag by airing videos of the flag being raised in September 2018, during the mass games events marking North Korea's 70th anniversary. The videos included brief images of troops, fighter jets releasing blue, red, and white smoke, civilians, new high-rise apartments in the capital, fireworks displays, and students in their school uniforms, all at the same event. The South China Morning Post stated in a 2019 article that an economic and cultural transformation was already under way within North Korea. It started in earnest in February 2018, during the Pyeongchang Winter Olympic Games, when top musicians from North Korea were sent to perform in South Korea, including a female quintet who performed in black shorts and red tops. Two months later, Supreme Leader Kim Jong-un attended a performance by the South Korean girl group Red Velvet, the first K-pop show ever held in Pyongyang. The North Korean musicians who performed in South Korea were so highly praised that Kim decided to send them to Beijing for another goodwill tour in January 2019. Part of this change has been the introduction of other cultures, including Western culture, previously condemned as vulgar and corrupt but now slowly making its way to the North Korean people.
Second-hand Harry Potter books can now be read at the National Library, and Bollywood films such as 3 Idiots have had runs in the country's cinemas. The changes have also reached the economic sector, with factories producing products associated more with the West, such as Air Jordan shoes, for national consumption. Per the amendments made to the Constitution in 2019, the former economic methods of management, Ch'ŏngsan-ni in agriculture and Taean in industry, were phased out altogether. After the 8th Party Congress, Kim Byung-yeon of Seoul National University said that North Korean GDP had decreased by 10% between 2017 and 2019, and by a further 5% in 2020.

Economic sectors

Manufacturing

North Korea also runs a planned economy in industry: the government provides fuel and materials to a factory, and the factory manufactures products in the types and quantities the government requires. North Korea's self-reliant development strategy assigned top priority to developing heavy industry, with parallel development in agriculture and light industry. This policy was achieved mainly by giving heavy industry preferential allocation of state investment funds. More than 50% of state investment went to the industrial sector during the 1954–1976 period (47.6%, 51.3%, 57.0%, and 49.0%, respectively, during the Three-Year Plan, Five-Year Plan, First Seven-Year Plan, and Six-Year Plan). As a result, gross industrial output grew rapidly. As was the case with the growth in national output, the pace of growth slowed markedly after the 1960s. The rate declined from 41.7% and 36.6% a year during the Three-Year Plan and Five-Year Plan, respectively, to 12.8%, 16.3%, and 12.2%, respectively, during the First Seven-Year Plan, Six-Year Plan, and Second Seven-Year Plan.
As a result of faster growth in industry, that sector's share in total national output increased from 16.8% in 1946 to 57.3% in 1970. Since the 1970s, industry's share in national output has remained relatively stable. From all indications, the pace of industrialization during the Third Seven-Year Plan up to 1991 was far below the planned rate of 9.6%. In 1990 it was estimated that the industrial sector's share of national output was 56%. Industry's share of the combined total of gross agricultural and industrial output climbed from 28% in 1946 to well over 90% in 1980. Heavy industry received more than 80% of the total state investment in industry between 1954 and 1976 (81.1%, 82.6%, 80%, and 83%, respectively, during the Three-Year Plan, Five-Year Plan, First Seven-Year Plan, and Six-Year Plan), and was overwhelmingly favored over light industry. North Korea claims to have fulfilled the Second Seven-Year Plan (1978–1984) target of raising industrial output in 1984 to 220% of the 1977 level, equivalent to an average annual growth rate of 12.2%. Judging from the production of major commodities that form the greater part of industrial output, however, it is unlikely that this happened. For example, the increases during the 1978–1984 plan period for electric power, coal, steel, metal-cutting machines, tractors, passenger cars, chemical fertilizers, chemical fibers, cement, and textiles were 78%, 50%, 85%, 67%, 50%, 20%, 56%, 80%, 78%, and 45%, respectively. Raw materials were in short supply, and so were energy and hard currency. Infrastructure decayed and machinery became obsolete. Unlike other socialist countries in Eastern Europe, North Korea kept planning in a highly centralized manner and refused to liberalize economic management.
In the mid-1980s, speculation that North Korea would emulate China in establishing Chinese-style special economic zones was flatly denied by Yun Ki-pok, then deputy chairman of the Economic Policy Commission (Yun became chairman in June 1989). China's special economic zones typically are coastal areas established to promote economic development and the introduction of advanced technology through foreign investment. Investors are offered preferential tax terms and facilities. The zones, which allow greater reliance on market forces, have more decision-making power in economic activities than do provincial-level units. Over the years, China has tried to convince the North Korean leadership of the advantages of these zones by giving tours of the various zones and explaining their value to visiting high-level officials. In April 1982, Kim Il-sung announced a new economic policy giving priority to increased agricultural production through land reclamation, development of the country's infrastructure—especially power plants and transportation facilities—and reliance on domestically produced equipment. There also was more emphasis on trade. In September 1984, North Korea promulgated a joint venture law to attract foreign capital and technology. The new emphasis on expanding trade and acquiring technology was not, however, accompanied by a shift in priorities away from support of the military. In 1991, North Korea announced the creation of a Special Economic Zone (SEZ) in the northeast regions of Rason (Rason Special Economic Zone) and Ch'ŏngjin. Investment in this SEZ has been slow in coming. Problems with infrastructure, bureaucracy, uncertainties about the security of investments, and questions of viability have hindered growth and development. Nevertheless, thousands of small Chinese businesses had set up profitable operations in North Korea by 2011.
A government research center, the Korea Computer Center, was set up in 1990, starting the slow development of an information technology industry. In 2013 and 2014, the State Economic Development Administration announced a number of smaller special economic zones covering export handling, mineral processing, high technology, gaming, and tourism.

Garment industry

The most successful export industry is the garment industry. Production is carried out by a North Korean firm for a European or other foreign partner, by a Chinese firm operating in North Korea with a North Korean partner, or by North Korean workers working in Chinese or other foreign factories. Wages are the lowest in northeastern Asia.

Automotive industry

North Korean motor vehicle production serves military, industrial, and construction goals, while private car ownership by citizens remains in low demand. Building on Soviet origins, the subsequent practice of cloning foreign models, and a recent automobile joint venture, North Korea has developed a wide-ranging automotive industry producing all types of vehicles: urban and off-road minis; luxury cars; SUVs; small, medium, heavy, and super-heavy cargo trucks; construction and off-road trucks; minibuses and minivans; coach, civilian, and articulated buses; trolleybuses; and trams. However, North Korea produces far fewer vehicles than its production capacity allows, owing to the ongoing economic crisis. North Korea has not joined or collaborated with the OICA or any other automotive organization, so critical information about its motor vehicle industry is limited.

Power and energy

The energy sector is one of the most serious bottlenecks in the North Korean economy. Since 1990, the supply of oil, coal, and electricity has declined steadily, seriously affecting all sectors of the economy.
Crude oil was formerly imported by pipeline at "friendship prices" from the former Soviet Union or China, but the withdrawal of Russian concessions and the reduction of imports from China sharply reduced annual imports between 1988 and 1997. As the imported oil was refined into fuels for transportation and agricultural machinery, the cutback in oil imports caused critical problems in transportation and agriculture. According to statistics compiled by the South Korean agency Statistics Korea based on International Energy Agency (IEA) data, per capita electricity consumption fell from its peak of 1,247 kilowatt-hours in 1990 to a low of 712 kilowatt-hours in 2000. It slowly rose thereafter, to 819 kilowatt-hours in 2008, a level below that of 1970. North Korea has no coking coal, but has substantial reserves of anthracite in Anju, Aoji (Ŭndŏk), and other areas. Coal production peaked at 43 million tons in 1989 and steadily declined to 18.6 million tons in 1998. Major causes of coal shortages include mine flooding and outdated mining technology. As coal was used mainly for industry and electricity generation, the decrease in coal production caused serious problems in industrial production and electricity generation. Coal production is unlikely to increase significantly until North Korea imports modern mining technology. Electricity generation peaked in 1989 at about 30 TWh. There were seven large hydroelectric plants in the 1980s; four were along the Yalu River, built with Chinese aid and supplying power to both countries. In 1989, 60% of electricity generation was hydroelectric and 40% fossil-fueled, mostly coal-fired. In 1997, coal accounted for more than 80% of primary energy consumption and hydropower for more than 10%. Net imports of coal represented only about 3% of coal consumption. Hydroelectric power plants generated about 55% of North Korea's electricity and coal-fired thermal plants about 39% in 1997.
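As a rough cross-check of the consumption figures cited above (not any official methodology, just arithmetic on the quoted Statistics Korea/IEA numbers), the size of the decline and the partial recovery can be computed directly:

```python
# Per capita electricity consumption in kWh, from the Statistics Korea /
# IEA figures quoted above.
peak_1990 = 1247
low_2000 = 712
partial_2008 = 819

# Fractional decline from the 1990 peak to the 2000 low.
decline = (peak_1990 - low_2000) / peak_1990
print(f"1990-2000 decline: {decline:.1%}")  # roughly a 43% drop

# Share of the lost consumption that had been regained by 2008.
recovered = (partial_2008 - low_2000) / (peak_1990 - low_2000)
print(f"share of the drop recovered by 2008: {recovered:.1%}")  # 20.0%
```

The arithmetic underlines how modest the recovery was: by 2008 only a fifth of the per capita consumption lost during the 1990s had been regained.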
However, with only 20% of the per capita electricity generation of Japan, North Korea suffered from chronic supply shortages. Coal exports to China currently account for a major portion of North Korea's revenue. Some hydroelectric facilities were believed to be out of operation due to damage from major flooding in 1995. Coal-fired power plants were running well under capacity, due in part to a serious decline in coal supply and in part to problems with the transportation of coal. The electricity supply steadily declined, reaching 17 TWh in 1998. Transmission losses stood at 15.8% as of 2014.

Construction

Construction has been an active sector in North Korea. This was demonstrated not only through large housing programmes, most visible in the high-rise apartment blocks in Pyongyang, but also in the smaller modern apartment complexes widespread even in the countryside. These are dwarfed in every sense by "grand monumental edifices". The same may apply even to apparently economically useful projects such as the Nampo Dam, which cost US$4bn. The years of economic contraction in the 1990s slowed this sector as it did others; the shell of the 105-story Ryugyŏng Hotel towered unfinished on Pyongyang's skyline for over a decade. The Bank of Korea claims that construction's share of GDP fell by almost one-third between 1992 and 1994, from 9.1% to 6.3%. This accords with a rare official figure of 6% for 1993, when the sector was said to have employed 4.2% of the labour force. However, the latter figure excludes the Korean People's Army, which visibly does much of the country's construction work. Since about 2012, when 18 tower blocks were built in Pyongyang, a construction boom has taken place there. Major projects include the Mansudae People's Theatre (2012), Munsu Water Park (2013), the modernisation of Pyongyang Sunan International Airport (2015), and the Science and Technology Center (2015).
Banking

The Central Bank of North Korea, under the Ministry of Finance, has a network of 227 local branches. Several reissues of banknotes in recent years suggest that citizens are inclined to hoard rather than bank any savings they make from their incomes; reportedly they now also prefer foreign currency. At least two foreign aid agencies have recently set up microcredit schemes, lending to farmers and small businesses. In late 2009, North Korea revalued its currency, effectively confiscating all privately held money above the equivalent of US$35 per person. The revaluation effectively wiped out the savings of many North Koreans. Days after the revaluation, the won dropped 96% against the United States dollar. Pak Nam-gi, the director of the Planning and Finance Department of North Korea's ruling Workers' Party, was blamed for the disaster and executed in 2010. In 2004 and 2006, laws were passed to codify rules for savings and commercial banking, but it was not until 2012 that North Korean banks started to seriously compete for retail customers. Competing electronic cash cards have become widely accepted in Pyongyang and other cities, but are generally not linked to bank accounts. North Korean banks have introduced retail products which permit a mobile phone app to make payments and top-ups. In May 2013, the Chinese banks China Merchants Bank, Industrial and Commercial Bank of China, China Construction Bank, and Agricultural Bank of China stopped "all cross-border cash transfers, regardless of the nature of the business" with North Korea. The Bank of China, China's primary institution for foreign exchange transactions, said on May 14, 2013, that "it had closed the account of Foreign Trade Bank, North Korea's main foreign exchange bank". However, smaller banks based in northeastern China across the border from North Korea were reportedly still handling large-scale cross-border transfers.
For example, the Bank of Dalian branch in Dandong was still making transfers to North Korea. Since 2015, Kim Jong-un has sought to enlarge and reform the banking sector to assist economic growth, with credit cards viewed as a way to increase the circulation of money. The concept of socialist commercial banks is being developed to utilize "idle funds" effectively, including the savings of individuals, to promote economic growth.

Retail

Until the early 2000s, the official retail sector was mainly state-controlled, under the direction of the People's Services Committee. Consumer goods were few and of poor quality, with most provided on a ration basis. There were state-run stores and direct factory outlets for the masses, and special shops with luxuries for the elite—as well as a chain of hard-currency stores (a joint venture with the association of pro-Pyongyang Korean residents in Japan, the Ch'ongryŏn), with branches in large cities. In 2002 and in 2010, private markets were progressively legalized, mostly for food sales. As of 2013, urban and farmer markets were held every 10 days, and most urban residents lived within 2 km of a market. In 2012, the third large shopping mall in Pyongyang, the Kwangbok Area Shopping Center, opened, and in 2014 construction of another large shopping mall started. As of 2017, these malls sold competing brands of goods; for example, at least ten different kinds of toothpaste were being sold. In 2017, the Korea Institute for National Unification estimated there were 440 government-approved markets employing about 1.1 million people.

Food

Agriculture

North Korea's sparse agricultural resources limit agricultural production. Climate, terrain, and soil conditions are not particularly favorable for farming, and the cropping season is relatively short. Only about 17% of the total landmass is arable, and only part of that is well suited for cereal cultivation; the major portion of the country is rugged mountain terrain.
The weather varies markedly according to elevation, and lack of precipitation, along with infertile soil, makes land at elevations higher than 400 meters unsuitable for purposes other than grazing. Precipitation is geographically and seasonally irregular, and in most parts of the country as much as half the annual rainfall occurs in the three summer months. This pattern favors the cultivation of paddy rice in warmer regions that are outfitted with irrigation and flood control networks. Rice yields are 5.3 tonnes per hectare, close to international norms. In 2005, the FAO ranked North Korea an estimated 10th in the production of fresh fruit and an estimated 19th in the production of apples. Farming is concentrated in the flatlands of the four west coast provinces, where a longer growing season, level land, adequate rainfall, and good irrigated soil permit the most intensive cultivation of crops. A narrow strip of similarly fertile land runs through the eastern seaboard Hamgyŏng provinces and Kangwŏn Province, but the interior provinces of Chagang and Ryanggang are too mountainous, cold, and dry to allow much farming. The mountains contain the bulk of North Korea's forest reserves, while the foothills within and between the major agricultural regions provide land for livestock grazing and fruit tree cultivation. Since self-sufficiency remains an important pillar of North Korean ideology, self-sufficiency in food production is deemed a worthy goal. Another aim of government policies—to reduce the gap between urban and rural living standards—requires continued investment in the agricultural sector. The stability of the country depends on steady, if not rapid, increases in the availability of food items at reasonable prices. In the early 1990s, there were severe food shortages.
The most far-reaching statement on agricultural policy is embodied in Kim Il-sung's 1964 Theses on the Socialist Agrarian Question in Our Country, which underscores the government's concern for agricultural development. Kim emphasized technological and educational progress in the countryside as well as collective forms of ownership and management. As industrialization progressed, the share of agriculture, forestry, and fisheries in total national output declined from 63.5% in 1945 and 31.4% in 1946 to a low of 26.8% in 1990. Their share of the labor force also declined, from 57.6% in 1960 to 34.4% in 1989. In the 1990s, the decreasing ability to carry out mechanized operations (including the pumping of water for irrigation), as well as the lack of chemical inputs, clearly contributed to reduced yields and increased harvesting and post-harvest losses. Incremental improvements in agricultural production have been made since the late 1990s, bringing North Korea close to self-sufficiency in staple foods by 2013. In particular, rice yields have steadily improved, though yields on other crops have generally not. The production of protein foods remains inadequate. Access to chemical fertilizer has declined, but the use of compost and other organic fertilizer has been encouraged.

Fisheries

North Korean fisheries export seafood, primarily crab, to Dandong, Liaoning, illicitly. Crabs, clams, and conches from North Korea's Yellow Sea waters are popular in China, possibly because the less salty water improves their taste.

Food distribution system

Since the 1950s, a majority of North Koreans have received their food through the Public Distribution System (PDS). The PDS requires farmers in agricultural regions to hand over a portion of their production to the government, which then reallocates the surplus to urban regions that cannot grow their own food. About 70% of the North Korean population, including the entire urban population, receives food through this government-run system.
Before the floods, recipients were generally allotted 600–700 grams per day, while high officials, military men, heavy laborers, and public security personnel were allotted slightly larger portions of 700–800 grams per day. As of 2013, the target average distribution was 573 grams of cereal equivalent per person per day, varying according to age, occupation, and whether rations were received elsewhere (such as school meals). However, as of 2019 this figure had been reduced to 312 grams per day, according to an investigation conducted by the United Nations between March 29 and April 12. Decreases in production affected the quantity of food available through the public distribution system. Shortages were compounded when the North Korean government imposed further restrictions on collective farmers. When farmers, who had never been covered by the PDS, were ordered by the government to reduce their own food allotments from 167 kilograms to 107 kilograms of grain per person each year, they responded by withholding portions of the required amount of grain. By 2008, the system had significantly recovered, and from 2009 to 2013 rations averaged 400 grams per person per day for much of the year, though in 2011 this dropped to 200 grams per day from May to September. It is estimated that in the early 2000s, the average North Korean family drew some 80% of its income from small businesses that were technically illegal (though the prohibition went unenforced). In 2002 and in 2010, private markets were progressively legalized. As of 2013, urban and farmer markets were held every 10 days, most urban residents lived within 2 km of a market, and markets played an increasing role in obtaining food.

Crisis and famine

From 1994 to 1998, North Korea suffered a famine. Since North Korea is a closed country, the precise death toll is difficult to determine.
According to various studies, the death toll from starvation or malnutrition is estimated at between 240,000 and 480,000. Since 1998 there has been a gradual recovery in agricultural production, which by 2013 brought North Korea back close to self-sufficiency in staple foods. However, as of 2013, most households had borderline or poor food consumption, and consumption of protein remained inadequate. In the 1990s, the North Korean economy saw stagnation turn into crisis. Economic assistance from the Soviet Union and China had been an important factor in its economic growth. With the Soviet Union's collapse in 1991, that support was withdrawn and payment in hard currency was demanded for imports. China stepped in to provide some assistance and supplied food and oil, most of it reportedly at concessionary prices. The North Korean economy was undermined, and its industrial output began to decline in 1990. Deprived of industrial inputs, including fertilizers, pesticides, and electricity for irrigation, agricultural output also started to decrease even before North Korea suffered a series of natural disasters in the mid-1990s. This evolution, combined with a series of natural disasters including record floods in 1995, caused one of the worst economic crises in North Korea's history. Other causes of this crisis were high defense spending (about 25% of GDP) and poor governance. In December 1991, North Korea established a "zone of free economy and trade" encompassing the northeastern port cities of Unggi (Sŏnbong), Ch'ŏngjin, and Najin. The establishment of this zone also had ramifications for how far North Korea would go in opening its economy to the West and to South Korea, for the future of the development scheme for the Tumen River area, and, more important, for how much North Korea would reform its economic system. North Korea announced in December 1993 a three-year transitional economic policy placing primary emphasis on agriculture, light industry, and foreign trade.
However, lack of fertilizer, natural disasters, and poor storage and transportation practices have left the country more than a million tons per year short of grain self-sufficiency. Moreover, lack of foreign exchange to purchase spare parts and oil for electricity generation left many factories idle. The 1990s famine paralyzed many of the Stalinist economic institutions. The government pursued Kim Jong-il's Songun policy, under which the military is deployed to direct production and infrastructure projects. As a consequence of the government's policy of establishing economic self-sufficiency, the North Korean economy has become increasingly isolated from that of the rest of the world, and its industrial development and structure do not reflect its international competitiveness. Domestic firms are shielded from international as well as domestic competition; the result is chronic inefficiency, poor quality, limited product diversity, and underutilization of plants. This protectionism also limits the size of the market for North Korean producers, preventing them from taking advantage of economies of scale. Food shortages The food shortage was primarily precipitated by the loss of fuel and other raw material imports from China and the Soviet Union, which had been essential to support an energy-intensive and energy-inefficient farming system. Following the collapse of the Soviet Union, the concessional trade relationships which had benefited North Korea were no longer available. The three flood and drought years between 1994 and 1996 only served to complete the collapse of the agricultural sector. In 2004, more than half (57%) of the population did not have enough food to stay healthy, 37% of children had their growth stunted, and many mothers severely lacked nutrition. In 2006, the World Food Program (WFP) and FAO estimated a requirement of 5.3 to 6.5 million tons of grain when domestic production fulfilled only 3.825 million tons.
The country also faces land degradation after forests stripped for agriculture resulted in soil erosion. In 2008, a decade after the worst years of the famine, total production was 3.34 million tons (grain equivalent) compared with a need of 5.98 million tons. Thirty-seven percent of the population was deemed to be food insecure. Weather continued to pose challenges every year, but overall food production grew gradually, and by 2013 production had increased to 5.03 million tons of cereal equivalent, the highest level since the crisis, against a minimum requirement of 5.37 million tons. In 2014, North Korea had an exceptionally good harvest, 5.08 million tons of cereal equivalent, almost sufficient to feed the entire population. While food production had recovered significantly since the hardest years of 1996 and 1997, the recovery was fragile, subject to adverse weather and year-to-year economic shortages. Distribution was uneven, with the Public Distribution System being largely ineffective. In most years, North Korea now has lower malnutrition levels than some richer Asian countries. However, in 2019 North Korea had the worst harvest in over a decade, which the United Nations described as a "hunger crisis". Mining According to a 2012 report by the South Korea-based North Korea Resource Institute (NKRI), North Korea has substantial reserves of iron ore, coal, limestone, and magnesite. In addition, North Korea is thought to have tremendous potential rare-metal resources, which have been valued in excess of US$6 trillion. It is the world's 18th largest producer of iron and zinc, and has the 22nd largest coal reserves in the world. It is also the 15th largest fluorite producer and 12th largest producer of copper and salt in Asia. Other major natural resources in production include lead, tungsten, graphite, magnesite, gold, pyrites, fluorspar, and hydropower. In 2015, North Korea exported 19.7 million tons of coal, worth $1.06 billion, much of it to China.
In 2016, it was estimated that coal shipments to China accounted for about 40% of exports. However, starting in February 2017, China suspended all North Korean coal imports, although, according to China, overall trade with North Korea increased. Information technology and culture North Korea has a developing information technology industry. In 2018, a technological exhibition unveiled a new Wi-Fi service called Mirae ("Future"), which allowed mobile devices to access the intranet network in Pyongyang. The exhibition also showcased a home automation system using speech recognition in Korean. North Korea's cartoon animation studios such as SEK Studio sub-contract work from South Korean animation studios. Mansudae Overseas Projects builds monuments around the world. Organization and management North Korea's economy has been unique in its elimination of markets. By the 1960s, market elements had been suppressed almost completely. Almost all items, from food to clothes, have traditionally been handed out through a public distribution system, with money having only a symbolic meaning. Rations of food depend on one's place in the hierarchy of the system, wherein positions seem to be semi-hereditary. Until the late 1980s, peasants were not allowed to cultivate private garden plots. Since the government is the dominant force in the development and management of the economy, bureaus and departments have proliferated at all administrative levels. There are fifteen committees—such as the agricultural and state planning committees—one bureau, and twenty departments under the supervision of the Cabinet; of these, twelve committees, one bureau, and sixteen departments are involved in economic management. In the early 1990s, several vice premiers of the then State Administration Council supervised economic affairs. Organizations undergo frequent reorganization.
Many of these agencies have their own separate branches at lower levels of government while others maintain control over subordinate sections in provincial and county administrative agencies. Around 1990, with the collapse of the Soviet Union, restrictions on private sales, including grain, ceased to be enforced. In 2014, North Korea announced the "May 30th measures". These planned to give more freedom to farmers, allowing them to keep 60% of their produce. Enterprise managers would also be allowed to hire and fire workers, and to decide with whom they do business and where they buy raw materials and spare parts. Some reports suggest that these measures would allow nominally state-run enterprises to be run on capitalist lines like those in China. Economic planning North Korea, one of the world's most centrally planned and isolated economies, faces desperate economic conditions. Industrial capital stock is nearly beyond repair as a result of years of underinvestment and shortages of spare parts. Industrial and power output have declined in parallel. During what North Korea called the "peaceful construction" period before the Korean War, the fundamental tasks of the economy were to overtake the level of output and efficiency attained toward the end of the Japanese occupation; to restructure and develop a viable economy reoriented toward the communist-bloc countries; and to begin the process of socializing the economy.
Nationalization of key industrial enterprises and land reform, both of which were carried out in 1946, laid the groundwork for two successive one-year plans in 1947 and 1948, respectively, and the Two-Year Plan of 1949–50. It was during this period that the piece-rate wage system and the independent accounting system began to be applied and that the commercial network increasingly came under state and cooperative ownership. The basic goal of the Three-Year Plan, officially named "The Three-Year Post-war Reconstruction Plan of 1954–56", was to reconstruct an economy torn by the Korean War. The plan stressed more than merely regaining the prewar output levels. The Soviet Union, other East European countries and China provided reconstruction assistance. The highest priority was developing heavy industry, but an earnest effort to collectivize farming also was begun. At the end of 1957, output of most industrial commodities had returned to 1949 levels, except for a few items such as chemical fertilizers, carbides, and sulfuric acid, whose recovery took longer. Having basically completed the task of reconstruction, the state planned to lay a solid foundation for industrialization while completing the socialization process and solving the basic problems of food and shelter during the Five-Year Plan of 1957–1960. The socialization process was completed by 1958 in all sectors of the economy, and the Ch'ŏllima Movement was introduced. Although growth rates reportedly were high, there were serious imbalances among the different economic sectors. Because rewards were given to individuals and enterprises that met production quotas, frantic efforts to fulfill plan targets in competition with other enterprises and industries caused disproportionate growth among various enterprises, between industry and agriculture and between light and heavy industries. 
Because resources were limited and the transportation system suffered bottlenecks, resources were diverted to politically well-connected enterprises or those whose managers complained the loudest. An enterprise or industry that performed better than others often did so at the expense of others. Such disruptions intensified as the target year of the plan approached. Until the 1960s, North Korea's economy grew much faster than South Korea's. Although North Korea was behind in total national output, it was ahead of South Korea in per capita national output because of its smaller population. For example, in 1960 North Korea's population was slightly over 10 million people, while South Korea's population was almost 25 million people. Annual economic growth rates of 30% and 21% during the Three-Year Plan of 1954–1956 and the Five-Year Plan of 1957–1960, respectively, were reported. After claiming early fulfillment of the Five-Year Plan in 1959, North Korea officially designated 1960 a "buffer year"—a year of adjustment to restore balances among sectors before the next plan became effective in 1961. Not surprisingly, the same phenomenon recurred in subsequent plans. Because the Five-Year Plan was fulfilled early, it became a de facto four-year plan. Beginning in the early 1960s, however, North Korea's economic growth slowed until it was stagnant at the beginning of the 1990s. Various factors explain the very high rate of economic development of the country in the 1950s and the general slowdown since the 1960s. During the reconstruction period after the Korean War, there were opportunities for extensive economic growth—attainable through the communist regime's ability to marshal idle resources and labor and to impose a low rate of consumption. This general pattern of initially high growth resulting in a high rate of capital formation was mirrored in other Soviet-type economies.
Toward the end of the 1950s, as reconstruction work was completed and idle capacity began to diminish, the economy had to shift from the extensive to the intensive stage, where the simple communist discipline of marshaling underutilized resources became less effective. In the new stage, inefficiency arising from emerging bottlenecks led to diminishing returns. Further growth would only be attained by increasing efficiency and technological progress. Beginning in the early 1960s, a series of serious bottlenecks began to impede development. Bottlenecks were pervasive and generally were created by the lack of arable land, skilled labor, energy, and transportation, and by deficiencies in the extractive industries. Moreover, both land and marine transportation lacked modern equipment. The inability of the energy and extractive industries as well as of the transportation network to supply power and raw materials as rapidly as the manufacturing plants could absorb them began to slow industrial growth. The First Seven-Year Plan (initially 1961–1967) built on the groundwork of the earlier plans but changed the focus of industrialization. Heavy industry, with the machine tool industry as its linchpin, was given continuing priority. During the plan, however, the economy experienced widespread slowdowns and reverses for the first time, in sharp contrast to the rapid and uninterrupted growth during previous plans. Disappointing performance forced the planners to extend the plan three more years, until 1970. During the last part of the de facto ten-year plan, emphasis shifted to pursuing parallel development of the economy and of defense capabilities. This shift was prompted by concern over the military takeover in South Korea by General Park Chung-hee (1961–1979), escalation of the United States involvement in Vietnam, and the widening Sino-Soviet split.
It was thought that stimulating a technological revolution in the munitions industry was one means to achieve these parallel goals. In the end, the necessity to divert resources to defense became the official explanation for the plan's failure. The Six-Year Plan of 1971–1976 followed immediately in 1971. In the aftermath of the poor performance of the preceding plan, growth targets of the Six-Year Plan were scaled down substantially. Because some of the proposed targets in the First Seven-Year Plan had not been attained even by 1970, the Six-Year Plan did not deviate much from its predecessor in basic goals. The Six-Year Plan placed more emphasis on technological advance, self-sufficiency (Juche) in industrial raw materials, improving product quality, correcting imbalances among different sectors, and developing the power and extractive industries; the last of these had been deemed largely responsible for slowdowns during the First Seven-Year Plan. The plan called for attaining a self-sufficiency rate of 60–70% in all industrial sectors by substituting domestic raw materials wherever possible and by organizing and renovating technical processes to make the substitution feasible. Improving transport capacity was seen as one of the urgent tasks in accelerating economic development—it was one of the major bottlenecks of the Six-Year Plan. North Korea claimed to have fulfilled the Six-Year Plan by the end of August 1975, a full year and four months ahead of schedule. Under the circumstances, it was expected that the next plan would start without delay in 1976, a year early, as was the case when the First Seven-Year Plan was instituted in 1961. Even if the Six-Year Plan had been completed on schedule, the next plan should have started in 1977. However, it was not until nearly two years and four months later that the long-awaited plan was unveiled—1977 had become a "buffer year".
The inability of the planners to continuously formulate and institute economic plans reveals as much about the inefficacy of planning itself as about the extent of the economic difficulties and administrative disruptions facing the country. For example, targets for successive plans have to be based on the accomplishments of preceding plans. If these targets are underfulfilled, all targets of the next plan—initially based on fulfillment of the preceding plan—have to be reformulated and adjusted. Aside from underfulfillment of the targets, widespread disruptions and imbalances among various sectors of the economy further complicate plan formulation. The basic thrust of the Second Seven-Year Plan (1978–1984) was to achieve the three-pronged goals of self-reliance, modernization, and "scientification". Although the emphasis on self-reliance was not new, it had not previously been the explicit focus of an economic plan. This new emphasis might have been a reaction to mounting foreign debt originating from large-scale imports of Western machinery and equipment in the mid-1970s. Through modernization North Korea hoped to increase mechanization and automation in all sectors of the economy. "Scientification" means the adoption of up-to-date production and management techniques. The specific objectives of the economic plan were to strengthen the fuel, energy, and resource bases of industry through priority development of the energy and extractive industries; to modernize industry; to substitute domestic resources for certain imported raw materials; to expand freight-carrying capacity in railroad, road, and marine transportation systems; to centralize and containerize the transportation system; and to accelerate a technical revolution in agriculture. In order to meet the manpower and technology requirements of an expanding economy, the education sector also was targeted for improvements.
The quality of the comprehensive eleven-year compulsory education system was to be enhanced to train more technicians and specialists, particularly in the fields of fuel, mechanical, electronic, and automation engineering. Successful fulfillment of the so-called nature-remaking projects also was part of the Second Seven-Year Plan. These projects referred to the five-point program for nature transformation unveiled by Kim Il-sung in 1976: completing the irrigation of non-paddy fields; reclaiming 1,000 square kilometres of new land; building 1,500 to 2,000 km of terraced fields; carrying out afforestation and water conservation work; and reclaiming tidal land. From all indications, the Second Seven-Year Plan was not successful. North Korea generally downplayed the accomplishments of the plan, and no other plan received less official fanfare. It was officially claimed that the economy had grown at an annual rate of 8.8% during the plan, somewhat below the planned rate of 9.6%. The reliability of this aggregate measure, however, is questionable. During the plan, the target annual output of 10 million tons of grains (cereals and pulses) was attained. However, by official admission, the targets of only five other commodities were fulfilled. Judging from the growth rates announced for some twelve industrial products, it is highly unlikely that the total industrial output increased at an average rate of 12.2% as claimed. After the plan concluded, there was no new economic plan for two years, an indication of both the plan's failure and the severity of the economic and planning problems confronting the economy in the mid-1980s. From 1998 to 2003, the government implemented a plan for scientific and technical development, which focused on the nation's IT and electronic industry. Corruption In 2019, North Korea was ranked 172nd in the Transparency International Corruption Perceptions Index with a score of 17 out of 100.
Labor Growth and changes in the structure and ownership pattern of the economy have also changed the labor force. By 1958 individual private farmers, who once constituted more than 70% of the labor force, had been transformed into or replaced by state or collective farmers. Private artisans, merchants, and entrepreneurs had joined state or cooperative enterprises. In the industrial sector in 1963, the last year for which such data are available, there were 2,295 state enterprises and 642 cooperative enterprises. The size and importance of the state enterprises can be surmised from the fact that state enterprises, which constituted 78% of the total number of industrial enterprises, contributed 91% of total industrial output. Labor force (12.6 million) by occupation: agriculture 35%; industry and services 65% (2008 est.). External trade Statistics from North Korea's trade partners are collected by international organizations like the United Nations and the International Monetary Fund, and by the South Korean Ministry of Unification. It has also been estimated that imports of arms from the Soviet Union in the period 1988 to 1990 accounted for around 30% of North Korea's total imports, and that between 1981 and 1989 North Korea earned approximately $4 billion from the export of arms, approximately 30% of North Korea's total exports in that period. The nominal dollar value of arms exports from North Korea in 1996 was estimated to have been around $50 million. North Korea's foreign trade deteriorated in the 1990s. After bottoming out at $1.4 billion in 1998, it recovered slightly. North Korea's trade total in 2002 was $2.7 billion: only about 50% of the $5.2 billion recorded in 1988, even in nominal US dollars. These figures exclude intra-Korean trade, deemed internal, which rose in 2002 to $641 million. During the late 2000s trade grew strongly, almost tripling between 2007 and 2011 to $5.6 billion, with much of the growth being with China.
By about 2010 external trade had returned to 1990 levels, and by 2014 it was nearly double 1990 levels, with trade with China increasing from 50% of total trade in 2005 to nearly 90% in 2014. In 2015, it was estimated that exports to China were $2.3 billion—83% of total exports of $2.83 billion. In addition to Kaesŏng and Kŭmgang-san, other special economic areas were established at Sinŭiju in the northwest (on the border with China), and at Rasŏn in the northeast (on the border with China and Russia). International sanctions, many related to North Korea's development of weapons of mass destruction, impeded international trade to some degree. United States President Barack Obama approved an executive order in April 2011 that declared "the importation into the United States, directly or indirectly, of any goods, services, or technology from North Korea is prohibited". Operational sanctions included United Nations Security Council Resolutions 1695, 1718, 1874, 1928, 2087, and 2094. Reports in 2018 indicated that trade sanctions (bans on almost all exports and the freezing of overseas accounts) were seriously affecting the economy. The main paper, Rodong Sinmun, was running short of paper and publishing only a third of its normal print run; two energy plants supplying electricity to Pyongyang had to be shut down intermittently due to lack of coal, causing blackouts; coal mines were operating under capacity due to lack of fuel; coal could not be transported due to lack of fuel; and food rations had been cut by half. The Taep'oong International Investment Group of Korea is the official company that manages overseas investments in North Korea. After 1956, North Korea reached out to the Third World in the hope of making trade deals. However, according to analyst Benjamin R. Young: "In the end, this approach proved ineffective, and Pyongyang never succeeded in developing robust trade relations with the Global South — a situation that appears unlikely to change anytime soon".
North–South economic ties North and South Korea's economic ties have fluctuated greatly over the past 30 years or so. In the late 1990s and most of the 2000s, north–south relations warmed under the Sunshine Policy of President Kim Dae-jung. Many firms agreed to invest in North Korea, encouraged by the South Korean government's commitment to cover their losses, should investment projects in |
May 2010, more than 120,000 North Koreans owned mobile phones; this number had increased to 301,000 by September 2010, 660,000 by August 2011, and 900,000 by December 2011. Orascom reported 432,000 North Korean subscribers after two years of operation (December 2010), increasing to 809,000 by September 2011, and exceeding one million by February 2012. By April 2013 subscriber numbers neared two million. By 2015 the figure had grown to three million. In 2011, 60% of Pyongyang's citizens between the ages of 20 and 50 had a cellphone. On June 15, 2011, StatCounter.com confirmed that some North Koreans use Apple's iPhones, as well as Nokia's and Samsung's smartphones. In November 2020, no mobile phones could dial into or out of the country, and there was no Internet connection. A 3G network covered 94 percent of the population, but only 14 percent of the territory. Koryolink has no international roaming agreements. Pre-paid SIM cards can be purchased by visitors to North Korea to make international (but not domestic) calls. Prior to January 2013, foreigners had to surrender their phones at the border crossing or airport before entering the country, but with the availability of local SIM cards this policy is no longer in place. Internet access, however, is only available to resident foreigners and not tourists. North Korean mobile phones use a digital signature system to prevent access to unsanctioned files, and log usage information that can be physically inspected. A survey in 2017 found that 69% of households had a mobile phone. In September 2019, a previously unknown company, Kwangya Trading Company (광야무역회사의), announced the release of a cell phone for North Korean consumer use called the Kimtongmu. Although state-run media report that the phone was developed by North Korean outlets, it is likely sourced from a Chinese OEM and outfitted with North Korean software. International connection North Korea has had a varying number of connections to other nations.
Currently, international fixed-line connections consist of a network connecting Pyongyang to Beijing and Moscow, and Chongjin to Vladivostok. Communications were opened with South Korea in 2000. In May 2006, TransTeleCom Company and North Korea's Ministry of Communications signed an agreement for the construction and joint operation of a fiber-optic transmission line at the Khasan–Tumangang railway checkpoint on the North Korea–Russia border. This is the first direct land link between Russia and North Korea. TTC's partner in the design, construction, and connection of the communication line from the Korean side to the junction was the Korea Communication Company of North Korea's Ministry of Communications. The line was built around STM-1-level digital equipment, with the possibility of further increasing bandwidth. The construction was completed in 2007. Since joining Intersputnik in 1984, North Korea has operated 22 lines of frequency-division multiplexing and 10 lines of single channel per carrier for communication with Eastern Europe, and in late 1989 international direct dialing service through a microwave link was introduced from Hong Kong. A satellite ground station near Pyongyang provides direct international communications using the International Telecommunications Satellite Corporation (Intelsat) Indian Ocean satellite. A satellite communications center was installed in Pyongyang in 1986 with French technical support. An agreement to share in Japan's telecommunications satellites was reached in 1990. North Korea joined the Universal Postal Union in 1974 but has direct postal arrangements with only a select group of countries.
Fiber optic lines Following the agreement with UNDP, the Pyongyang Fiber Optic Cable Factory was built in April 1992 and the country's first optical fiber cable network, consisting of 480 pulse-code modulation (PCM) lines and 6 automatic exchange stations from Pyongyang to Hamhung (300 kilometers), was installed in September 1995. Moreover, the nationwide land leveling and rezoning campaign initiated by Kim Jong-il in Kangwon province in May 1998 and in North Pyongan province in January 2000 facilitated the construction of provincial and county fiber optic lines, which were laid by tens of thousands of Korean People's Army (KPA) soldier-builders and provincial shock brigade members mobilized for the large-scale public works projects designed to rehabilitate the hundreds of thousands of hectares of arable land devastated by the natural disasters in the late 1990s. Television Broadcasting in North Korea is tightly controlled by the state and is used as a propaganda arm of the ruling Korean Workers' Party. The Korean Central Television station is located in Pyongyang, and there also are stations in major cities, including Chŏngjin, Kaesŏng, Hamhŭng, Haeju, and Sinŭiju. There are three channels in Pyongyang but only one channel in other cities. Imported Japanese-made color televisions have a North Korean brand name superimposed, but nineteen-inch black-and-white sets have been produced locally since 1980. One estimate placed the total number of television sets in use in the early 1990s at 250,000 sets. A study in 2017 found that 98% of households had a TV set. Radio Visitors are not allowed to bring a radio. As part of the government's information blockade policy, North Korean radios and televisions must be modified to receive only government stations. These modified radios and televisions must be registered with a special state department and are subject to random inspection. The removal of the official seal is punishable by law.
To buy a TV set or a radio, North Korean citizens are required to get special permission from officials at their places of residence or employment. North Korea has two AM radio broadcasting networks, the Voice of Korea and the Korean Central Broadcasting Station, and one FM network. All three networks have stations in major cities that offer local programming. There also is a powerful shortwave transmitter for overseas broadcasts in several languages. The official government station is the Korean Central Broadcasting Station (KCBS), which broadcasts in Korean. In 1997 there were 3.36 million radio sets. Internet National area network Kwangmyong is a North Korean "walled garden" national intranet opened in 2000. It is accessible from within North Korea's major cities and counties, as well as universities and major industrial and commercial organizations. Kwangmyong offers 24-hour unlimited access by dial-up telephone line. A survey in 2017 found that 19% of households had a computer, but that only 1% nationally and 5% in Pyongyang had access to the internet. In August 2016, it was reported that North Korea had launched a state-approved video streaming service which has been likened to Netflix. The service, known as "Manbang" (meaning everyone), uses a set-top box to stream live TV, on-demand video and newspaper articles (from the state newspaper Rodong Sinmun) over the internet. The service is only available to citizens in Pyongyang, Sinŭiju and Sariwon. The state TV channel Korean Central Television (KCTV) described the service as a
problems and government restrictions. Public transport predominates, and most of it is electrified. Restrictions on freedom of movement Travel to North Korea is tightly controlled. The standard route to and from North Korea is by plane or train via Beijing. Transport directly to and from South Korea was possible on a limited scale from 2003, when a road was opened (bus tours, no private cars), until 2008. Freedom of movement within North Korea is also limited, as citizens are not allowed to move around freely inside their country. On October 14, 2018, North and South Korea agreed to restore inter-Korean rail and road transportation. On November 22, 2018, North and South Korea reopened a road on the Korean border which had been closed since 2004. On November 30, 2018, inter-Korean rail transportation resumed when a South Korean train crossed into North Korea for the first time since November 2008. On December 8, 2018, a South Korean bus crossed into North Korea. Roads Fuel constraints and the near absence of private automobiles have relegated road transportation to a secondary role. The road network was estimated to have grown between 1990 and 1999, but only about 7.5% of it was paved. However, The World Factbook (published by the US Central Intelligence Agency) gives different road-length and paved-share figures as of 2006. As for road quality, drivers often swerve and change lanes to evade potholes, at times crossing into opposite-direction lanes. Likewise, sections under repair may not be properly signalled, so oncoming traffic should always be expected, even on a divided motorway. There are three major multilane highways: an expressway connecting Pyongyang and Wonsan on the east coast, an expressway connecting Pyongyang and its port, Nampo, and a four-lane motorway linking Pyongyang and Kaesong. The overwhelming majority of the estimated 264,000 vehicles in use in 1990 were for the military.
Rural bus service connects all villages, and cities have bus and tram services. Traffic has driven on the right since 1945/1946. In cities, driving speeds are set by which lane a driver is in: the speed limit rises for the first, second, and subsequent (if existing) lanes from the right, respectively, and a white-on-blue sign posts the per-lane limits. The leftmost lane, if it is number 3 from the right or higher and is not a turning lane, is often left vacant, even by tourist buses, while the second-from-right lane is generally used to overtake vehicles in lane one, such as public transport buses and trams. Besides the blue in-city sign, all other settings, such as motorways and roads outside cities, use the more widely known red-circle-with-number sign to post speed limits. On motorways, limits are likewise posted per lane from the right, as on the Pyongyang–Kaesong highway, for example. The rightmost lane of a motorway is sometimes subject to a lower limit near on-ramp joining points, as seen on the Pyongyang–Myohyang highway. Automobile transportation is further restricted by a series of regulations. According to North Korean exile Kim Ji-ho, unless a driver receives a special permit, it is forbidden to drive alone (the driver must carry passengers). Other required documents are a military mobilization permit (to transport soldiers in times of war), a certificate of driver training (to be renewed every year), a fuel validity document (a certificate confirming that the fuel was purchased from an authorized source), and a mechanical certificate (to prove that the car is in working order). Since about 2014, horizontally mounted traffic lights and cameras have been installed in central Pyongyang and other cities. Outside Pyongyang, roundabouts are often used at busy junctions. As of 2017, electric bicycles are becoming popular in Pyongyang; about 5% of bicycles are electric. Both locally produced and Chinese electric bicycles are available.
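The lane-based speed rule described above (limits rise with each lane counted from the right, and lanes beyond the last posted value share the final limit) can be sketched as a small lookup. The posted figures are not given in this text, so the numeric values below are purely hypothetical placeholders:

```python
def lane_speed_limit(lane_from_right: int, posted_limits: list[int]) -> int:
    """Return the per-lane speed limit, where lane 1 is the rightmost.

    The limit rises with each lane from the right; lanes beyond the
    last posted value share the final (highest) limit.
    """
    if lane_from_right < 1:
        raise ValueError("lanes are numbered from 1 (rightmost)")
    idx = min(lane_from_right, len(posted_limits)) - 1
    return posted_limits[idx]

# Hypothetical posted values in km/h (the real figures are not stated here):
city_limits = [40, 60, 70]  # lanes 1, 2, 3+ from the right
assert lane_speed_limit(1, city_limits) == 40
assert lane_speed_limit(4, city_limits) == 70
```

The same lookup applies on motorways, only with a different list of posted values.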
As of 2016, the road network is about 25% of South Korea's road system in length. Public transport There is a mix of locally built and imported trolleybuses and trams in the major urban centres of North Korea. Earlier fleets were obtained from Europe and China. Other forms of public transport include a commuter narrow-gauge railway from Hamhung to Hungnam, which links to the 2.8 Vinylon Complex. North Korea also has regularly scheduled motorcoach service connecting major cities and nearby towns to one another, identifiable by destination signs such as Pyongyang–Sariwon or Pyongyang–Wonsan. Some bus lines supplement the electric transportation in Pyongyang, as seen in a 1989 map that was likely obtained during the 13th World Festival of Youth and Students. Some routes are still identifiable, such as route 10, which now has a destination of Sadong-Daedongmun and has its own stop on Okryu street. Some parts have changed much more drastically, like the southwest of Pyongyang, which has seen a lot of new construction. One thing that makes tracing the routes difficult is that transportation vehicles in North Korea rarely show a route number, opting for a destination sign instead. Some buses may be used for non-regularly scheduled service, but are indistinguishable because all buses are state owned and can be used for a variety of purposes. Railways The Korean State Railway is the only rail operator in North Korea. It has a network of standard-gauge and narrow-gauge lines; as of 2007, well over 80% of the standard-gauge network, along with part of the narrow gauge, was electrified. The narrow-gauge segment runs in the Haeju peninsula.
Because of the lack of maintenance of the rail infrastructure and vehicles, travel time by rail is increasing. It has been reported that the trip from Pyongyang to Kaesong can take up to six hours. Water transport Water transport on the major rivers and along the coasts plays a growing role in freight and passenger traffic. Except for the Yalu and Taedong rivers, most of the inland waterways are navigable only by small boats. Coastal traffic is heaviest on the eastern seaboard, whose deeper waters can accommodate larger vessels. The major ports are Nampo on the west coast and Rajin, Chongjin, Wonsan, and Hamhung on the east coast. The country's harbor loading capacity in the 1990s was estimated at almost 35 million tons a year. There is continuing investment in upgrading and expanding port facilities, developing transportation, particularly on the Taedong River, and increasing the share of international cargo carried by domestic vessels. Merchant marine In the early 1990s, North Korea possessed an oceangoing merchant fleet, largely domestically produced, of 68 ships (of at least 1,000 gross registered tons), totalling 465,801 gross registered tons, which included 58 cargo ships and two tankers. As of 2008, this had increased to a total of 167 vessels, consisting mainly of cargo ships and tankers. Ferry service North Korea maintains the Man Gyong Bong 92, a ferry connecting Rajin and Vladivostok, Russia. Air transport North Korea's international air connections are limited in frequency and number. As of 2011, scheduled flights operate only from Pyongyang Sunan International Airport to Beijing, Dalian, Shenyang, Shanghai, Bangkok, Kuala Lumpur, Singapore, Moscow, Khabarovsk, Vladivostok, and Kuwait International Airport. Charters to other destinations operate as demand requires. Prior to 1995 many routes to Eastern Europe were operated, including services to Sofia, Belgrade, Prague, and Budapest, along with others.
Air Koryo is the country's national airline. Air China also operates flights between Beijing and Pyongyang. In 2013, MIAT Mongolian Airlines began operating direct charter services from Ulaanbaatar to Pyongyang with Boeing 737-800 aircraft. Internal flights are available between Pyongyang, Hamhung, Haeju (HAE), Hungnam (HGM), Kaesong (KSN), Kanggye, Kilju, Najin (NJN), Nampo (NAM), Sinuiju (SII), Samjiyon, Wonsan (WON), Songjin (SON), and Chongjin (CHO). All civil aircraft are operated by Air Koryo, which has a fleet of 19 passenger and cargo aircraft, all of which are Soviet or more modern Russian types. As of 2013, the CIA estimates that North Korea has 82 usable airports, 39 of which have permanent-surface runways. It was reported that North Korean air traffic controllers were cut off from the international global satellite communications network in 2017 because North Korea had not made the required payments. Controllers at Pyongyang Sunan International Airport had to use conventional telephone lines to inform their counterparts at Incheon International Airport that the flight carrying North Korean delegates to the 2018 Winter Olympic Games in South Korea had taken off. Vehicle markings Road vehicles in North Korea bear distance stars: paint markings which display how far the particular vehicle has traveled without incident, with each star representing a set distance travelled without an accident. The bus in this example has three stars.
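The distance-star convention described above is a simple floor division: stars earned equals incident-free distance divided by the fixed per-star distance. Since the per-star distance is not stated in this text, it is treated as a parameter, and the figures in the example are hypothetical:

```python
def distance_stars(km_without_incident: float, km_per_star: float) -> int:
    """Number of distance stars a vehicle has earned.

    Each star marks a fixed distance driven without an accident; the
    exact per-star distance is not given here, so it is a parameter.
    """
    if km_per_star <= 0:
        raise ValueError("km_per_star must be positive")
    return int(km_without_incident // km_per_star)

# With a hypothetical 50,000 km per star, a bus with 160,000
# incident-free kilometres would display three stars:
assert distance_stars(160_000, 50_000) == 3
```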
Korea may possess fissile material for around two to nine nuclear warheads. The North Korean Songun ("Military First") policy elevates the KPA to the primary position in the government and society. According to North Korea's state news agency, military expenditures for 2010 made up 15.8 percent of the state budget. Most analyses of North Korea's defence sector, however, estimate that defence spending constitutes between one-quarter and one-third of all government spending. According to the International Institute for Strategic Studies, North Korea's defence budget consumed some 25 percent of central government spending. In the mid-1970s and early 1980s, according to figures released by the Polish Arms Control and Disarmament Agency, between 32 and 38 percent of central government expenditures went towards defence. North Korea sells missiles and military equipment to many countries worldwide. In April 2009, the United Nations named the Korea Mining and Development Trading Corporation (KOMID) as North Korea's primary arms dealer and main exporter of equipment related to ballistic missiles and conventional weapons. It also named Korea Ryonbong as a supporter of North Korea's military-related sales. Historically, North Korea has assisted a vast number of revolutionary, insurgent and terrorist groups in more than 62 countries. A cumulative total of more than 5,000 foreign personnel have been trained in North Korea, and over 7,000 military advisers, primarily from the Reconnaissance General Bureau, have been dispatched to some forty-seven countries. Some of the organisations which received North Korean aid include the Polisario Front, Janatha Vimukthi Peramuna, the Communist Party of Thailand, the Palestine Liberation Organization and the Islamic Revolutionary Guard Corps. The Zimbabwean Fifth Brigade received its initial training from KPA instructors. North Korean troops allegedly saw combat during the Libyan–Egyptian War and the Angolan Civil War.
Up to 200 KPAF pilots took part in the Vietnam War, scoring several kills against U.S. aircraft. Two KPA anti-aircraft artillery regiments were sent to North Vietnam as well. North Korean instructors trained Hezbollah fighters in guerrilla warfare tactics around 2004, prior to the Second Lebanon War. During the Syrian Civil War, Arabic-speaking KPA officers may have assisted the Syrian Arab Army in military operations planning and have supervised artillery bombardments in the Aleppo area. Service branches Ground Force The Korean People's Army Ground Force (KPAGF) is the main branch of the Korean People's Army responsible for land-based military operations. It is the de facto army of North Korea. Naval Force The Korean People's Army Naval Force (KPANF) is organized into two fleets (West Fleet and East Fleet, the latter being the larger of the two) which, owing to the limited range and general disrepair of their vessels, are not able to support each other, let alone meet for joint operations. The East Fleet is headquartered at T'oejo-dong and the West Fleet at Nampho. A number of training, shipbuilding and maintenance units and a naval air wing report directly to Naval Command Headquarters at Pyongyang. Air and Anti-Air Force The Korean People's Army Air and Anti-Air Force (KPAAF) is also responsible for North Korea's air defence forces through the use of anti-aircraft artillery and surface-to-air missiles (SAM). While much of the equipment is outdated, the high saturation of multilayered, overlapping, mutually supporting air defence sites provides a formidable challenge to enemy air attacks. Strategic Rocket Force The Korean People's Army Strategic Rocket Force (KPASRF) is a major division of the KPA that controls North Korea's nuclear and conventional strategic missiles. It is mainly equipped with surface-to-surface missiles of Soviet and Chinese design, as well as locally developed long-range missiles. 
Special Operation Force The Korean People's Army Special Operation Force (KPASOF) is an asymmetric force with a total troop size of 200,000. Since the Korean War, it has focused on infiltrating troops into the territory of South Korea and conducting sabotage. Capabilities After the Korean War, North Korea maintained a powerful military, though one smaller than South Korea's. In 1967 the KPA forces of about 345,000 were much smaller than the South Korean ground forces of about 585,000. North Korea's relative isolation and economic plight starting from the 1980s has now tipped the balance of military power towards the better-equipped South Korean military. In response to this predicament, North Korea relies on asymmetric warfare techniques and unconventional weaponry to achieve parity against high-tech enemy forces. North Korea is reported to have developed a wide range of technologies towards this end, such as stealth paint to conceal ground targets, midget submarines and human torpedoes, and blinding laser weapons; it probably has a chemical weapons program and is likely to possess a stockpile of chemical weapons. The Korean People's Army operates ZM-87 anti-personnel lasers, which are banned under the United Nations Protocol on Blinding Laser Weapons. Since the 1980s, North Korea has also been actively developing its own cyber warfare capabilities. The secretive Bureau 121, the elite North Korean cyber warfare unit, is estimated to comprise approximately 1,800 highly trained hackers. In December 2014, the Bureau was accused of hacking Sony Pictures and making threats, leading to the cancellation of The Interview, a political satire comedy film based on the assassination of Kim Jong-un. The Korean People's Army has also made advances in electronic warfare by developing GPS jammers, including current vehicle-mounted models.
Jammers with a range of more than 100 km are being developed, along with electromagnetic pulse bombs. The Korean People's Army has also made attempts to jam South Korean military satellites. North Korea does not have satellites capable of obtaining satellite imagery useful for military purposes, and appears to use imagery from foreign commercial platforms. Despite the general fuel and ammunition shortages for training, it is estimated that the wartime strategic reserves of food for the army are sufficient to feed the regular troops for 500 days, while fuel and ammunition, amounting to 1.5 million and 1.7 million tonnes respectively, are sufficient to wage a full-scale war for 100 days. The KPA does not operate aircraft carriers, but has other means of power projection. Korean People's Air Force Il-76MD aircraft provide a strategic airlift capacity of 6,000 troops, while the Navy's sealift capacity amounts to 15,000 troops. The Strategic Rocket Forces operate more than 1,000 ballistic missiles according to South Korean officials in 2010, although the U.S. Department of Defense reported in 2012 that North Korea has fewer than 200 missile launchers. North Korea acquired 12 Foxtrot-class and Golf-II-class missile submarines as scrap in 1993. Some analysts suggest that these have either been refurbished with the help of Russian experts or that their launch tubes have been reverse-engineered and externally fitted to regular submarines or cargo ships. However, GlobalSecurity reports that the submarines were rust-eaten hulks with the launch tubes inactivated under Russian observation before delivery, and the U.S. Department of Defense does not list them as active. A photograph of Kim Jong-un receiving a briefing from his top generals on 29 March 2013 showed a list that purported to show that the military had a minimum of 40 submarines, 13 landing ships, 6 minesweepers, 27 support vessels and 1,852 aircraft.
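The reserve figures quoted above (1.5 million tonnes of fuel and 1.7 million tonnes of ammunition, each sized for 100 days of full-scale war) imply straightforward daily consumption rates:

```python
# Daily consumption rates implied by the stated wartime reserves.
FUEL_RESERVE_T = 1_500_000   # tonnes of fuel
AMMO_RESERVE_T = 1_700_000   # tonnes of ammunition
WAR_DAYS = 100               # days of full-scale war the reserves cover

fuel_per_day = FUEL_RESERVE_T / WAR_DAYS   # 15,000 t/day
ammo_per_day = AMMO_RESERVE_T / WAR_DAYS   # 17,000 t/day

print(f"fuel: {fuel_per_day:,.0f} t/day, ammunition: {ammo_per_day:,.0f} t/day")
```

That is, the estimate assumes a full-scale war would consume roughly 15,000 tonnes of fuel and 17,000 tonnes of ammunition per day.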
The Korean People's Army operates a very large quantity of equipment, including 4,100 tanks, 2,100 APCs, 8,500 field artillery pieces, 5,100 multiple rocket launchers, 11,000 air defence guns and some 10,000 MANPADS and anti-tank guided missiles in the Ground Force; about 500 vessels in the Navy and 730 combat aircraft in the Air Force, of which 478 are fighters and 180 are bombers. North Korea also has the largest special forces in the world, as well as the largest submarine fleet. The equipment is a mixture of World War II vintage vehicles and small arms, widely proliferated Cold War technology, and more modern Soviet or locally produced weapons. North Korea possesses a vast array of long-range artillery in shelters just north of the Korean Demilitarized Zone. It has been a long-standing cause for concern that a preemptive strike or retaliatory strike on Seoul using this arsenal of artillery north of the Demilitarized Zone would lead to a massive loss of life in Seoul. Estimates of how many people would die in an attack on Seoul vary. When the Clinton administration mobilised forces over the reactor at Yongbyon in 1994, planners concluded that retaliation by North Korea against Seoul could kill 40,000 people. Other estimates project hundreds of thousands or possibly millions of fatalities if North Korea uses chemical munitions. Military equipment Weapons The KPA possess a variety of Chinese- and Soviet-sourced equipment and weaponry, as well as locally produced versions.

The KPA consists of five branches: the Ground Force, the Naval Force, the Air and Anti-Air Force, the Strategic Rocket Forces, and the Special Operation Force. The KPA considers its primary adversaries to be the Republic of Korea Armed Forces and United States Forces Korea, across the Korean Demilitarized Zone, as it has since the Armistice Agreement of July 1953.
it is the second largest military organisation in the world, with a substantial share of the North Korean population actively serving, in reserve or in a paramilitary capacity. History Korean People's Revolutionary Army 1932–1948 Kim Il-sung's anti-Japanese guerrilla army, the Korean People's Revolutionary Army, was established on 25 April 1932. This revolutionary army was transformed into the regular army on 8 February 1948. Both of these dates are celebrated as army days, with decennial anniversaries treated as major celebrations, except from 1978 to 2014, when only the 1932 anniversary was celebrated. Korean Volunteer Army 1939–1948 In 1939, the Korean Volunteer Army (KVA) was formed in Yan'an, China. The two individuals responsible for the army were Kim Tu-bong and Mu Chong. At the same time, a school was established near Yan'an for training military and political leaders for a future independent Korea. By 1945, the KVA had grown to approximately 1,000 men, mostly Korean deserters from the Imperial Japanese Army. During this period, the KVA fought alongside the Chinese communist forces from which it drew its arms and ammunition. After the defeat of the Japanese, the KVA accompanied the Chinese communist forces into eastern Jilin, intending to gain recruits from ethnic Koreans in China, particularly from Yanbian, and then enter Korea. Soviet Korean Units Just after World War II and during the Soviet Union's occupation of the part of Korea north of the 38th Parallel, the Soviet 25th Army headquarters in Pyongyang issued a statement ordering all armed resistance groups in the northern part of the peninsula to disband on 12 October 1945. Two thousand Koreans with previous experience in the Soviet army were sent to various locations around the country to organise constabulary forces with permission from Soviet military headquarters, and the force was created on 21 October 1945.
Formation of National Army The headquarters felt a need for a separate unit for security around railways, and the formation of the unit was announced on 11 January 1946. That unit was activated on 15 August of the same year to supervise existing security forces and creation of the national armed forces. Military institutes such as the Pyongyang Academy (became No. 2 KPA Officers School in Jan. 1949) and the Central Constabulary Academy (became KPA Military Academy in Dec. 1948) soon followed for the education of political and military officers for the new armed forces. After the military was organised and facilities to educate its new recruits were constructed, the Constabulary Discipline Corps was reorganised into the Korean People's Army General Headquarters. The previously semi-official units became military regulars with the distribution of Soviet uniforms, badges, and weapons that followed the inception of the headquarters. The State Security Department, a forerunner to the Ministry of People's Defense, was created as part of the Interim People's Committee on 4 February 1948. The formal creation of the Korean People's Army was announced four days later on 8 February, the day after the Fourth Plenary Session of the People's Assembly approved the plan to separate the roles of the military and those of the police, seven months before the government of the Democratic People's Republic of Korea was proclaimed on 9 September 1948. In addition, the Ministry of State for the People's Armed Forces was established, which controlled a central guard battalion, two divisions, and an independent mixed and combined arms brigade. Conflicts and events Before the outbreak of the Korean War, Joseph Stalin equipped the KPA with modern tanks, trucks, artillery, and small arms (at the time, the South Korean Army had nothing remotely comparable either in numbers of troops or equipment). 
During the opening phases of the Korean War in 1950, the KPA quickly drove South Korean forces south and captured Seoul, only to lose 70,000 of their 100,000-strong army in the autumn after U.S. amphibious landings at the Battle of Incheon and a subsequent drive to the Yalu River. On 4 November, China openly staged a military intervention. On 7 December, China deprived Kim Il-sung of his right of command over the KPA. The KPA subsequently played a secondary role to Chinese forces in the remainder of the conflict. By the time of the Armistice in 1953, the KPA had sustained 290,000 casualties and lost 90,000 men as POWs. In 1953, the Military Armistice Commission (MAC) was established to oversee and enforce the terms of the armistice. The Neutral Nations Supervisory Commission (NNSC), made up of delegations from Czechoslovakia, Poland, Sweden and Switzerland, carried out inspections to ensure implementation of the terms of the Armistice that prevented reinforcements or new weapons being brought into Korea. From December 1962, Soviet thinking on the strategic scale was replaced with a people's war concept, and the Soviet idea of direct warfare with a Maoist war-of-attrition strategy. Along with the mechanisation of some infantry units, more emphasis was put on light weapons, high-angle indirect fire, night fighting, and sea denial. Date of establishment history Until 1977, the Korean People's Army's official date of establishment was 8 February 1948. In 1978 it was changed to 25 April 1932, the date on which Kim Il-sung's anti-Japanese guerrilla army, the Korean People's Revolutionary Army, considered the predecessor of the Korean People's Army, was formed. The date of establishment was officially changed back to 8 February 1948 by 2019, however.
Organization Commission and leadership The primary path for command and control of the KPA extends through the National Defence Commission, which was led by its chairman Kim Jong-il until 2011, to the Ministry of Defence and its General Staff Department. From there, command and control flows to the various bureaus and operational units. A secondary path, to ensure political control of the military establishment, extends through the Workers' Party of Korea's Central Military Commission. Since 1990, numerous and dramatic transformations within North Korea have led to the current command and control structure. The details of the majority of these changes are simply unknown to the world. What little is known indicates that many changes were the natural result of the deaths of the aging leadership, including Kim Il-sung (July 1994), Minister of People's Armed Forces O Chin-u (February 1995) and Minister of Defence Choi Kwang (February 1997). The vast majority of changes were undertaken to secure the power and position of Kim Jong-il. The State Affairs Commission (SAC), originally the National Defence Commission, was from its founding in 1972 part of the Central People's Committee (CPC), while the Ministry of Defence, from 1982 onward, was under direct presidential control. At the eighteenth session of the sixth Central People's Committee, held on 23 May 1990, the SAC was established as its own independent commission, rising to the same status as the CPC (now the Cabinet of North Korea) and no longer subordinated to it, as had been the case before. Concurrent with this, Kim Jong-il was appointed first vice-chairman of the State Affairs Commission. The following year, on 24 December 1991, Kim Jong-il was appointed Supreme Commander of the Korean People's Army.
Four months later, on 20 April 1992, Kim Jong-il was awarded the rank of Marshal, and his father, by virtue of being the KPA's founding commander-in-chief, became Grand Marshal; one year later Kim Jong-il became Chairman of the State Affairs Commission, by then under Supreme People's Assembly control under the 1992 constitution as amended. Almost all officers of the KPA began their military careers as privates; only very few people are admitted to a military academy without prior service. The result is an egalitarian military system in which officers are familiar with the life of a military private and a "military nobility" is all but nonexistent. Within the KPA, between December 1991 and December 1995, nearly 800 high officers (out of approximately 1,200) received promotions and preferential assignments. Three days after Kim Jong-il became Marshal, eight generals were appointed to the rank of Vice-Marshal. In April 1997, on the 85th anniversary of Kim Il-sung's birthday, Kim Jong-il promoted 127 general and admiral grade officers. The following April he ordered the promotions of another 22 generals and flag officers. Along with these changes, many KPA officers were appointed to influential positions within the Korean Workers' Party. These promotions continue today, coinciding with the Kim Il-sung birthday and KPA anniversary celebrations every April and, more recently, in July to honour the end of the Korean War. Under Kim Jong-il's leadership, political officers dispatched from the party monitored every move of a general's daily life, in a manner analysts compare to the work of Soviet political commissars during the early and middle years of the Soviet military establishment. Today the KPA exercises full control of both the Politburo and the Central Military Commission of the WPK, the KPA General Political and General Staff Departments and the Ministry of Defence, all of which have KPA representatives of at least general officer rank.
Following changes made during the 4th session of the 13th Supreme People's Assembly on 29 June 2016, the State Affairs Commission has overseen the Ministry of Defence as part of its systemic responsibilities. All members of the State Affairs Commission have membership status (regular or alternate) on the WPK Political Bureau. Ground force formations:
I Corps (Hoeyang County, Kangwon Province)
II Corps (Pyongsan County, North Hwanghae Province)
III Corps (Nampo, South Pyongan)
IV Corps (Haeju, South Hwanghae Province)
V Corps (Sepo County, Kangwon Province)
VII Corps (Hamhung, South Hamgyong Province)
Pyongyang Defense Command
IX Corps (Chongjin, North Hamgyong Province)
X Corps (Hyesan, Ryanggang Province)
XI Corps (Tokchon, South Pyongan Province)
XII Corps
Mechanised infantry divisions: 108th Division, 425th Division, 806th Division, 815th Division
820th Tank Corps
Conscription and terms of service North Korea has conscription for males for 10 years. Females are conscripted up until the age of 23. Article 86 of the North Korean Constitution states: "National defence is the supreme duty and honour of citizens. Citizens shall defend the country and serve in the armed forces as required by law." Soldiers serve three years of military service in the KPA, which also runs its own factories, farms and trading arms. Paramilitary organisations The Young Red Guards are the youth cadet corps of the KPA for secondary level and university level students. Every Saturday, they hold mandatory 4-hour military training drills, and have training activities on and off campus to prepare them for military service when they
its defense seriously, confronting countries it sees as threatening its sovereignty, and restricting the activities of foreign diplomats. History After 1945, the USSR supplied the economic and military aid that enabled North Korea to mount its invasion of South Korea in 1950. Soviet aid and influence continued at a high level during the Korean War. This was only the beginning of North Korea as governed by the faction rooted in an anti-Japanese Korean nationalist movement based in Manchuria and China, in which Kim Il-sung participated before later forming the Workers' Party of Korea (WPK). The assistance of Chinese troops after 1950, during the war, and their presence in the country until 1958 gave China some degree of influence in North Korea. In 1961, North Korea concluded formal mutual security treaties with the Soviet Union and China, which have not been formally ended. In the case of China, Kim Il-sung and Chou En-Lai signed the Sino-North Korean Mutual Aid and Cooperation Friendship Treaty, whereby Communist China pledged to immediately render military and other assistance by all means to its ally against any outside attack.
The treaty says, in short, that: The Chairman of the People's Republic of China and the Presidium of the Supreme People's Assembly of the Democratic People's Republic of Korea, determined, in accordance with Marxism–Leninism and the principle of proletarian internationalism and on the basis of mutual respect for state sovereignty and territorial integrity, mutual non-aggression, non-interference in each other's internal affairs, equality and mutual benefit, and mutual assistance and support, to make every effort to further strengthen and develop the fraternal relations of friendship, co-operation and mutual assistance between the People's Republic of China and the Democratic People's Republic of Korea, to jointly guard the security of the two peoples, and to safeguard and consolidate the peace of Asia and the world ... [Article I:] The Contracting Parties will continue to make every effort to safeguard the peace of Asia and the world and the security of all peoples ... [Article II:] In the event of one of the Contracting Parties being subjected to the armed attack by any state or several states jointly and thus being involved in a state of war, the other Contracting Party shall immediately render military and other assistance by all means at its disposal ... [Article V:] The Contracting Parties, on the principles of mutual respect for sovereignty, non-interference in each other's internal affairs, equality and mutual benefit and in the spirit of friendly co-operation, will continue to render each other every possible economic and technical aid in the cause of socialist construction of the two countries and will continue to consolidate and develop economic, cultural, and scientific and technical co-operation between the two countries ...
[Article VI:] The Contracting Parties hold that the unification of Korea must be realized along peaceful and democratic lines and that such a solution accords exactly with the national interests of the Korean people and the aim of preserving peace in the Far East. For most of the Cold War, North Korea avoided taking sides in the Sino-Soviet split. It was originally recognized only by countries in the Communist Bloc, until Algeria recognized it in 1958. East Germany was an important source of economic cooperation for North Korea. The East German leader, Erich Honecker, who visited in 1977, was one of Kim Il-sung's closest foreign friends. In 1986, the two countries signed an agreement on military co-operation. Kim was also close to maverick Communist leaders Josip Broz Tito of Yugoslavia and Nicolae Ceaușescu of Romania. North Korea began to play a part in the global radical movement, forging ties with such diverse groups as the Black Panther Party of the US, the Workers Party of Ireland, and the African National Congress. As it increasingly emphasized its independence, North Korea began to promote the doctrine of Juche ("self-reliance") as an alternative to orthodox Marxism-Leninism and as a model for developing countries to follow. When North-South dialogue started in 1972, North Korea began to receive diplomatic recognition from countries outside the Communist bloc. Within four years, North Korea was recognized by 93 countries, on par with South Korea's 96. North Korea gained entry into the World Health Organization and, as a result, sent its first permanent observer missions to the United Nations (UN). In 1975, it joined the Non-Aligned Movement. Libyan leader Muammar Gaddafi met with Kim Il-sung and was a close ally of the DPRK. In 1983, North Korea carried out the Rangoon bombing, a failed assassination attempt against South Korean President Chun Doo-hwan while he was visiting Burma.
This attack on neutral soil led many Third World countries to reconsider their ties with North Korea. During the 1980s, the pace of North Korea's establishment of new diplomatic relations slowed considerably. Following Kim Il-sung's 1984 visit to Moscow, there was a dramatic improvement in Soviet-DPRK relations, resulting in renewed deliveries of advanced Soviet weaponry to North Korea and increases in economic aid. In 1989, as a response to the 1988 Seoul Olympics, North Korea hosted the 13th World Festival of Youth and Students in Pyongyang. South Korea established diplomatic relations with the Soviet Union in 1990 and the People's Republic of China in 1992, which put a serious strain on relations between North Korea and its traditional allies. Moreover, the demise of Communist states in Eastern Europe in 1989 and the disintegration of the Soviet Union in 1991 resulted in a significant drop in communist aid to North Korea and a marked decline in relations with Russia. Subsequently, South Korea developed the "sunshine policy" towards North Korea, aiming for peaceful Korean reunification. This policy ended in 2009. In September 1991, North Korea became a member of the UN. In July 2000, it began participating in the ASEAN Regional Forum (ARF), as Foreign Minister Paek Nam-sun attended the ARF ministerial meeting in Bangkok on July 26–27. North Korea also expanded its bilateral diplomatic ties in that year, establishing diplomatic relations with Italy, Australia and the Philippines. The United Kingdom established diplomatic relations with North Korea on December 13, 2000, as did Canada in February 2001, followed by Germany and New Zealand on March 1, 2001. In 2006, North Korea test-fired a series of ballistic missiles, after Chinese officials had advised North Korean authorities not to do so.
As a result, Chinese authorities publicly rebuked what the West perceives as China's closest ally and supported UN Security Council Resolution 1718, which imposed sanctions on North Korea. At other times, however, China has blocked United Nations resolutions threatening sanctions against North Korea. In January 2009, China's paramount leader Hu Jintao and North Korea's supreme leader Kim Jong-il exchanged greetings and declared 2009 as the "year of China-DPRK friendship", marking 60 years of diplomatic relations between the two countries. On November 28, 2010, as part of the United States diplomatic cables leak, WikiLeaks and media partners such as The Guardian published details of communications in which Chinese officials referred to North Korea as a "spoiled child" and its nuclear program as "a threat to the whole world's security", while two anonymous Chinese officials claimed there was growing support in Beijing for Korean reunification under the South's government. In 2017, North Korea tested the Hwasong-15, an intercontinental ballistic missile capable of striking anywhere in the USA. Estimates of North Korea's nuclear arsenal at that time ranged between 15 and 60 bombs, probably including hydrogen bombs. In February 2018, North Korea sent a high-level delegation to the Winter Olympics in South Korea. Subsequently, Kim Jong-un met with President Moon Jae-in of South Korea and US President Donald Trump to discuss peace.

Inter-Korean relations

In August 1971, both North and South Korea agreed to hold talks through their respective Red Cross societies with the aim of reuniting the many Korean families separated following the division of Korea after the Korean War. After a series of secret meetings, both sides announced on July 4, 1972, an agreement to work toward peaceful reunification and an end to the hostile atmosphere prevailing on the peninsula.
Dialogue was renewed on several fronts in September 1984, when South Korea accepted the North's offer to provide relief goods to victims of severe flooding in South Korea. In a major initiative in July 1988, South Korean President Roh Tae-woo called for new efforts to promote North-South exchanges, family reunification, inter-Korean trade and contact in international forums. Roh followed up this initiative in a UN General Assembly speech in which South Korea offered to discuss security matters with the North for the first time. In September 1990, the first of eight prime minister-level meetings between officials of North Korea and South Korea took place in Seoul, beginning an especially fruitful period of dialogue. The prime ministerial talks resulted in two major agreements: the Agreement on Reconciliation, Nonaggression, Exchanges, and Cooperation (the Basic Agreement) and the Declaration on the Denuclearization of the Korean Peninsula (the Joint Declaration). The Joint Declaration on denuclearization was initialed on December 13, 1991. It forbade both sides to test, manufacture, produce, receive, possess, store, deploy, or use nuclear weapons and forbade the possession of nuclear reprocessing and uranium enrichment facilities. On January 30, 1992, North Korea also signed a nuclear safeguards agreement with the IAEA, as it had pledged to do in 1985 when acceding to the nuclear Non-Proliferation Treaty. This safeguards agreement allowed IAEA inspections to begin in June 1992. As the 1990s progressed, concern over the North's nuclear program became a major issue in North-South relations and between North Korea and the US. In 1998, South Korean President Kim Dae-jung announced a Sunshine Policy towards North Korea. This led in June 2000 to the first Inter-Korean summit, between Kim Dae-jung and Kim Jong-il. In September 2000, the North and South Korean teams marched together at the Sydney Olympics.
Trade increased to the point where South Korea became North Korea's largest trading partner. Starting in 1998, the Mount Kumgang Tourist Region was developed as a joint venture between the government of North Korea and Hyundai. In 2003, the Kaesong Industrial Region was established to allow South Korean businesses to invest in the North. In 2007, South Korean President Roh Moo-hyun held talks with Kim Jong-il in Pyongyang, and on October 4, 2007, the two signed a peace declaration. The document called for international talks to replace the Armistice which ended the Korean War with a permanent peace treaty. The Sunshine Policy was formally abandoned under subsequent South Korean President Lee Myung-bak in 2010. The Kaesong Industrial Park was closed in 2013, amid tensions about North Korea's nuclear weapons program. It reopened the same year but closed again in 2016. In 2017, Moon Jae-in was elected President of South Korea with promises to return to the Sunshine Policy. In his New Year address for 2018, North Korean leader Kim Jong-un proposed sending a delegation to the upcoming Winter Olympics in South Korea. The Seoul–Pyongyang hotline was reopened after almost two years. North and South Korea marched together in the Olympics opening ceremony and fielded a united women's ice hockey team. North Korea sent an unprecedented high-level delegation, headed by Kim Yo-jong, sister of Kim Jong-un, and Kim Yong-nam, President of the Presidium of the Supreme People's Assembly, as well as athletes
transport infrastructure, with most infrastructure concentrated around Greater Belfast, Greater Derry and Craigavon. Northern Ireland is served by three airports – Belfast International near Antrim, George Best Belfast City, integrated into the railway network at Sydenham in East Belfast, and City of Derry in County Londonderry. Major seaports at Larne and Belfast carry passengers and freight between Great Britain and Northern Ireland. Passenger railways are operated by Northern Ireland Railways. With Iarnród Éireann (Irish Rail), Northern Ireland Railways co-operates in providing the joint Enterprise service between Dublin Connolly and Lanyon Place. The whole of Ireland has a mainline railway network with a gauge of 5 ft 3 in (1,600 mm), which is unique in Europe and has resulted in distinct rolling stock designs. The only preserved line of this gauge on the island is the Downpatrick and County Down Railway, which operates heritage steam and diesel locomotives.

Main railway lines linking to and from Belfast Great Victoria Street railway station and Lanyon Place railway station are:
The Derry Line and the Portrush Branch
The Larne Line
The Bangor Line
The Portadown Line

Main motorways are:
M1 connecting Belfast to the south and west, ending in Dungannon
M2 connecting Belfast to the north. An unconnected section of the M2 also by-passes Ballymena

Additional short motorway spurs include:
M12 connecting the M1 to Portadown
M22 connecting the M2 to near Randalstown
M3 connecting the M1 (via the A12) and M2 in Belfast with the A2 dual carriageway to Bangor
M5 connecting Belfast to Newtownabbey

The cross-border road connecting the ports of Larne in Northern Ireland and Rosslare Harbour in the Republic of Ireland is being upgraded as part of an EU-funded scheme. European route E01 runs from Larne through the island of Ireland, Spain and Portugal to Seville.

Demographics

The population of Northern Ireland has risen yearly since 1978.
The population in 2011 was 1.8 million, having grown 7.5% over the previous decade from just under 1.7 million in 2001. This constitutes just under 3% of the population of the UK (62 million) and just over 28% of the population of the island of Ireland (6.3 million). The population density is 132 inhabitants per square kilometre. Most of the population of Northern Ireland is concentrated in its five largest cities: Belfast (the capital), Derry, Lisburn, Newtownabbey and Bangor. The population of Northern Ireland is almost entirely white (98.2%). In 2011, 88.8% of the population were born in Northern Ireland, with 4.5% born elsewhere in the United Kingdom, and 2.9% born in the Republic of Ireland. 4.3% were born elsewhere, three times the proportion recorded in 2001; most of these were from Eastern Europe. The largest non-white ethnic groups were Chinese (6,300) and Indian (6,200). Black people of various origins made up 0.2% of the 2011 population and people of mixed ethnicity also made up 0.2%.

Religion

At the 2011 census, 41.5% of the population identified as Protestant/non-Roman Catholic Christian, 41% as Roman Catholic, and 0.8% as non-Christian, while 17% identified with no religion or did not state one. The biggest of the Protestant/non-Roman Catholic Christian denominations were the Presbyterian Church (19%), the Church of Ireland (14%) and the Methodist Church (3%). In terms of community background (i.e. religion or religion brought up in), 48% of the population came from a Protestant background, 45% from a Catholic background, 0.9% from non-Christian backgrounds, and 5.6% from non-religious backgrounds.

Citizenship and identity

In the 2011 census in Northern Ireland, respondents gave their national identity as follows.
Several studies and surveys carried out between 1971 and 2006 have indicated that, in general, most Protestants in Northern Ireland see themselves primarily as British, whereas a majority of Roman Catholics regard themselves primarily as Irish (Northern Ireland Life and Times Survey, 1999, Community Relations module, variable NINATID: 72% of Protestants replied "British" and 68% of Catholics replied "Irish"; variable IRISH: 77% of Catholics replied "Strongly Irish"). This does not, however, account for the complex identities within Northern Ireland, given that many of the population regard themselves as "Ulster" or "Northern Irish", either as a primary or secondary identity. Overall, the Catholic population is somewhat more ethnically diverse than the more homogeneous Protestant population. 83.1% of Protestants identified as "British" or with a British ethnic group (English, Scottish, or Welsh) in the 2011 Census, whereas only 3.9% identified as "Irish". Meanwhile, 13.7% of Catholics identified as "British" or with a British ethnic group. A further 4.4% identified as "all other"; these are largely immigrants, for example from Poland. A 2008 survey found that 57% of Protestants described themselves as British, while 32% identified as Northern Irish, 6% as Ulster and 4% as Irish. Compared to a similar survey carried out in 1998, this shows a fall in the percentage of Protestants identifying as British and Ulster and a rise in those identifying as Northern Irish. The 2008 survey found that 61% of Catholics described themselves as Irish, with 25% identifying as Northern Irish, 8% as British and 1% as Ulster. These figures were largely unchanged from the 1998 results. People born in Northern Ireland are, with some exceptions, deemed by UK law to be citizens of the United Kingdom. They are also, with similar exceptions, entitled to be citizens of Ireland.
This entitlement was reaffirmed in the 1998 Good Friday Agreement between the British and Irish governments, which provides that: ...it is the birthright of all the people of Northern Ireland to identify themselves and be accepted as Irish or British, or both, as they may so choose, and accordingly [the two governments] confirm that their right to hold both British and Irish citizenship is accepted by both Governments and would not be affected by any future change in the status of Northern Ireland. As a result of the Agreement, the Constitution of the Republic of Ireland was amended. The current wording provides that people born in Northern Ireland are entitled to be Irish citizens on the same basis as people from any other part of the island. Neither government, however, extends its citizenship to all persons born in Northern Ireland. Both governments exclude some people born in Northern Ireland, in particular persons born without at least one parent who is a British or Irish citizen. The Irish restriction was given effect by the twenty-seventh amendment to the Irish Constitution in 2004. The position in UK nationality law is that most of those born in Northern Ireland are UK nationals, whether or not they so choose. Renunciation of British citizenship requires the payment of a fee, currently £372. In the 2011 census in Northern Ireland, respondents stated that they held the following passports.

Languages

English is spoken as a first language by almost all of the Northern Ireland population. It is the de facto official language, and the Administration of Justice (Language) Act (Ireland) 1737 prohibits the use of languages other than English in legal proceedings. Under the Good Friday Agreement, Irish and Ulster Scots (an Ulster dialect of the Scots language, sometimes known as Ullans) are recognised as "part of the cultural wealth of Northern Ireland".
Two all-island bodies for the promotion of these were created under the Agreement: Foras na Gaeilge, which promotes the Irish language, and the Ulster Scots Agency, which promotes the Ulster Scots dialect and culture. These operate separately under the aegis of the North/South Language Body, which reports to the North/South Ministerial Council. The British government in 2001 ratified the European Charter for Regional or Minority Languages. Irish (in Northern Ireland) was specified under Part III of the Charter, with a range of specific undertakings in relation to education, translation of statutes, interaction with public authorities, the use of placenames, media access, support for cultural activities and other matters. A lower level of recognition was accorded to Ulster Scots, under Part II of the Charter.

English

The dialect of English spoken in Northern Ireland shows influence from the lowland Scots language. There are supposedly some minute differences in pronunciation between Protestants and Catholics, for instance the name of the letter h, which Protestants tend to pronounce as "aitch", as in British English, and Catholics tend to pronounce as "haitch", as in Hiberno-English. However, geography is a much more important determinant of dialect than religious background.

Irish

The Irish language (Gaeilge), or Gaelic, is a native language of Ireland. It was spoken predominantly throughout what is now Northern Ireland before the Ulster Plantations in the 17th century, and most place names in Northern Ireland are anglicised versions of a Gaelic name. Today, the language is often associated with Irish nationalism (and thus with Catholics). However, in the 19th century the language was seen as a common heritage, with Ulster Protestants playing a leading role in the Gaelic revival. In the 2011 census, 11% of the population of Northern Ireland claimed "some knowledge of Irish" and 3.7% reported being able to "speak, read, write and understand" Irish.
In another survey, from 1999, 1% of respondents said they spoke it as their main language at home. The dialect spoken in Northern Ireland, Ulster Irish, has two main types, East Ulster Irish and Donegal Irish (or West Ulster Irish), and is the dialect closest to Scottish Gaelic (which developed into a separate language from Irish Gaelic in the 17th century). Some words and phrases are shared with Scottish Gaelic, and the dialects of east Ulster – those of Rathlin Island and the Glens of Antrim – were very similar to the dialect of Argyll, the part of Scotland nearest to Ireland. The dialects of Armagh and Down were also very similar to the dialects of Galloway. Use of the Irish language in Northern Ireland today is politically sensitive. The erection by some district councils of bilingual street names in both English and Irish, invariably in predominantly nationalist districts, is resisted by unionists, who claim that it creates a "chill factor" and thus harms community relationships. Efforts by members of the Northern Ireland Assembly to legislate for some official uses of the language have failed to achieve the required cross-community support, and the UK government has declined to legislate. There has recently been an increase in interest in the language among unionists in East Belfast.

Ulster Scots

Ulster Scots comprises varieties of the Scots language spoken in Northern Ireland. For a native English speaker, "[Ulster Scots] is comparatively accessible, and even at its most intense can be understood fairly easily with the help of a glossary." Along with the Irish language, the Good Friday Agreement recognised the dialect as part of Northern Ireland's unique culture, and the St Andrews Agreement recognised the need to "enhance and develop the Ulster Scots language, heritage and culture". Approximately 2% of the population claim to speak Ulster Scots.
However, the number speaking it as their main language at home is negligible: only 0.9% of 2011 census respondents claimed to be able to speak, read, write and understand Ulster Scots, though 8.1% professed to have "some ability".

Sign languages

The most common sign language in Northern Ireland is Northern Ireland Sign Language (NISL). However, because in the past Catholic families tended to send their deaf children to schools in Dublin, where Irish Sign Language (ISL) is commonly used, ISL is still common among many older deaf people from Catholic families. Irish Sign Language has some influence from the French family of sign languages, which includes American Sign Language (ASL). NISL takes a large component from the British family of sign languages (which also includes Auslan), with many borrowings from ASL. It is described as being related to Irish Sign Language at the syntactic level, while much of the lexicon is based on British Sign Language (BSL). The British Government recognises only British Sign Language and Irish Sign Language as the official sign languages used in Northern Ireland.

Culture

Northern Ireland shares both the culture of Ireland and the culture of the United Kingdom. Parades are a prominent feature of Northern Ireland society, more so than in the rest of Ireland or in Britain. Most are held by Protestant fraternities such as the Orange Order, and Ulster loyalist marching bands. Each summer, during the "marching season", these groups have hundreds of parades, deck streets with British flags, bunting and specially-made arches, and light large towering bonfires in the "Eleventh Night" celebrations. The biggest parades are held on 12 July (The Twelfth). There is often tension when these activities take place near Catholic neighbourhoods, which sometimes leads to violence. Since the end of the Troubles, Northern Ireland has witnessed rising numbers of tourists.
Attractions include cultural festivals, musical and artistic traditions, countryside and geographical sites of interest, public houses, welcoming hospitality and sports (especially golf and fishing). Since 1987, public houses have been allowed to open on Sundays, despite some opposition. The Ulster Cycle is a large body of prose and verse centring on the traditional heroes of the Ulaid in what is now eastern Ulster. It is one of the four major cycles of Irish mythology. The cycle centres on the reign of Conchobar mac Nessa, who is said to have been king of Ulster around the 1st century. He ruled from Emain Macha (now Navan Fort near Armagh) and had a fierce rivalry with queen Medb and king Ailill of Connacht and their ally Fergus mac Róich, former king of Ulster. The foremost hero of the cycle is Conchobar's nephew Cúchulainn, who features in the epic An Táin Bó Cúailnge (The Cattle Raid of Cooley), whose cattle raid is a casus belli between Ulster and Connacht.

Symbols

Northern Ireland comprises a patchwork of communities whose national loyalties are represented in some areas by flags flown from flagpoles or lamp posts. The Union Jack and the former Northern Ireland flag are flown in many loyalist areas, and the Tricolour, adopted by republicans as the flag of Ireland in 1916, is flown in some republican areas. Even kerbstones in some areas are painted red-white-blue or green-white-orange, depending on whether local people express unionist/loyalist or nationalist/republican sympathies. The official flag is that of the state having sovereignty over the territory, i.e. the Union Flag. The former Northern Ireland flag, also known as the "Ulster Banner" or "Red Hand Flag", is a banner derived from the coat of arms of the Government of Northern Ireland, which used it until 1972. Since 1972, it has had no official status. The Union Flag and the Ulster Banner are used exclusively by unionists.
UK flags policy states that in Northern Ireland, "The Ulster flag and the Cross of St Patrick have no official status and, under the Flags Regulations, are not permitted to be flown from Government Buildings" (Northern Irish flags from the World Flag Database). The Irish Rugby Football Union and the Church of Ireland have used the Saint Patrick's Saltire or "Cross of St Patrick". This red saltire on a white field was used to represent Ireland in the flag of the United Kingdom. It is still used by some British army regiments. Foreign flags are also found, such as Palestinian flags in some nationalist areas and Israeli flags in some unionist areas. The United Kingdom national anthem, "God Save the Queen", is often played at state events in Northern Ireland. At the Commonwealth Games and some other sporting events, the Northern Ireland team uses the Ulster Banner as its flag—notwithstanding its lack of official status—and the Londonderry Air (usually set to the lyrics of "Danny Boy"), which also has no official status, as its national anthem (Sport, Sectarianism and Society in a Divided Ireland by John Sugden and Alan Bairner, p. 60). The Northern Ireland national football team also uses the Ulster Banner as its flag but uses "God Save The Queen" as its anthem. Major Gaelic Athletic Association matches are opened by the national anthem of the Republic of Ireland, "Amhrán na bhFiann" ("The Soldier's Song"), which is also used by most other all-Ireland sporting organisations. Since 1995, the Ireland rugby union team has used a specially commissioned song, "Ireland's Call", as the team's anthem. The Irish national anthem is also played at Dublin home matches, being the anthem of the host country. Northern Irish murals have become well-known features of Northern Ireland, depicting past and present events and documenting peace and cultural diversity. Almost 2,000 murals have been documented in Northern Ireland since the 1970s.
Sport In Northern Ireland, sport is popular and important in the lives of many people. Sports tend to be organised on an all-Ireland basis, with a single team for the whole island. The most notable exception is association football, which has separate governing bodies for each jurisdiction. Field sports Association football The Irish Football Association (IFA) serves as the organising body for association football in Northern Ireland, with the Northern Ireland Football League (NIFL) responsible for the independent administration of the three divisions of national domestic football, as well as the Northern Ireland Football League Cup. The highest levels of competition within Northern Ireland are the NIFL Premiership and the NIFL Championship. However, many players from Northern Ireland compete with clubs in England and Scotland. NIFL clubs are semi-professional or intermediate. NIFL Premiership clubs are also eligible to compete in the UEFA Champions League and UEFA Europa League, with the league champions entering the Champions League second qualifying round, and the second-placed league finisher, the European play-off winners and the Irish Cup winners entering the Europa League second qualifying round. No club has ever reached the group stage. Despite Northern Ireland's small population, the Northern Ireland national football team qualified for the 1958, 1982 and 1986 FIFA World Cups, making it to the quarter-finals in 1958 and 1982, and reached the first knockout round of the European Championship in 2016. Rugby union The six counties of Northern Ireland are among the nine governed by the Ulster branch of the Irish Rugby Football Union, the governing body of rugby union in Ireland. Ulster is one of the four professional provincial teams in Ireland and competes in the United Rugby Championship and the European Cup. It won the European Cup in 1999.
In international competitions, the Ireland national rugby union team's recent successes include four Triple Crowns between 2004 and 2009 and a Grand Slam in 2009 in the Six Nations Championship. Cricket The Ireland cricket team represents both Northern Ireland and the Republic of Ireland. It is a full member of the International Cricket Council, having been granted Test status and full membership by the ICC in June 2017. The side competes in Test cricket, the highest level of the international game, as one of the ICC's 12 full-member countries. The men's side has played in the Cricket World Cup and the T20 World Cup and has won the ICC Intercontinental Cup four times. The women's side has played in the Women's World Cup. One of the men's side's regular international venues is Stormont in Belfast. Gaelic games Gaelic games include Gaelic football, hurling (and camogie), Gaelic handball and rounders. Of the four, Gaelic football is the most popular in Northern Ireland. Players play for local clubs, with the best being selected for their county teams. The Ulster GAA is the branch of the Gaelic Athletic Association responsible for the nine counties of Ulster, which include the six of Northern Ireland. These nine county teams participate in the Ulster Senior Football Championship, Ulster Senior Hurling Championship, All-Ireland Senior Football Championship and All-Ireland Senior Hurling Championship. Recent successes for Northern Ireland teams include Armagh's 2002 All-Ireland Senior Football Championship win and Tyrone GAA's wins in 2003, 2005, 2008 and 2021. Golf Perhaps Northern Ireland's most notable successes in professional sport have come in golf. Northern Ireland has contributed more major champions in the modern era than any other European country, with three in the space of just 14 months from the U.S. Open in 2010 to The Open Championship in 2011.
Notable golfers include Fred Daly (winner of The Open in 1947), Ryder Cup players Ronan Rafferty and David Feherty, leading European Tour professionals David Jones, Michael Hoey (a five-time winner on the tour) and Gareth Maybin, as well as three recent major winners: Graeme McDowell (winner of the U.S. Open in 2010, the first European to do so since 1970), Rory McIlroy (winner of four majors) and Darren Clarke (winner of The Open in 2011). Northern Ireland has also contributed several players to the Great Britain and Ireland Walker Cup team, including Alan Dunbar and Paul Cutler, who played on the victorious 2011 team in Scotland. Dunbar also won The Amateur Championship in 2012, at Royal Troon. The Golfing Union of Ireland, the governing body for men's and boys' amateur golf throughout Ireland and the oldest golfing union in the world, was founded in Belfast in 1891. Northern Ireland's golf courses include the Royal Belfast Golf Club (the earliest, formed in 1881), Royal Portrush Golf Club, the only course outside Great Britain to have hosted The Open Championship, and Royal County Down Golf Club (Golf Digest magazine's top-rated course outside the United States). Snooker Northern Ireland has produced two world snooker champions: Alex Higgins, who won the title in 1972 and 1982, and Dennis Taylor, who won in 1985. The highest-ranked Northern Ireland professional on the world circuit at present is Mark Allen from County Antrim. The sport is governed locally by the Northern Ireland Billiards and Snooker Association, which runs regular ranking tournaments and competitions. Motorsport Motorcycle racing Motorcycle racing is a particularly popular sport during the summer months, with the main meetings of the season attracting some of the largest crowds of any outdoor sporting event in the whole of Ireland. Two of the three major international road race meetings are held in Northern Ireland: the North West 200 and the Ulster Grand Prix.
In addition, racing on purpose-built circuits takes place at Kirkistown and Bishop's Court, while smaller road race meetings such as the Cookstown 100, the Armoy Road Races and the Tandragee 100 are also held; these form part of the Irish National Road Race Championships and have produced some of the greatest motorcycle racers in the history of the sport, notably Joey Dunlop. Motor racing Although Northern Ireland lacks an international automobile racecourse, two Northern Irish | Ireland a few months after the Ulster Volunteers. Ireland seemed to be on the brink of civil war. Unionists were in a minority in Ireland as a whole, but a majority in the province of Ulster, especially the counties of Antrim, Down, Armagh and Londonderry. Unionists argued that if Home Rule could not be stopped, then all or part of Ulster should be excluded from it. In May 1914, the British government introduced an Amending Bill to allow for 'Ulster' to be excluded from Home Rule. There was then debate over how much of Ulster should be excluded and for how long. Some Ulster unionists were willing to tolerate the 'loss' of some mainly-Catholic areas of the province. The crisis was interrupted by the outbreak of the First World War in August 1914, and Ireland's involvement in it. The British government abandoned the Amending Bill, and instead rushed through a new bill, the Suspensory Act 1914, suspending Home Rule for the duration of the war (Hennessey, Thomas: Dividing Ireland, World War I and Partition, Routledge Press, 1998, p. 76), with the exclusion of Ulster still to be decided. Partition of Ireland By the end of the war (during which the 1916 Easter Rising had taken place), most Irish nationalists now wanted full independence rather than home rule. In September 1919, British Prime Minister David Lloyd George tasked a committee with planning another home rule bill.
Headed by English unionist politician Walter Long, it was known as the 'Long Committee'. It decided that two devolved governments should be established—one for the nine counties of Ulster and one for the rest of Ireland—together with a Council of Ireland for the "encouragement of Irish unity". Most Ulster unionists wanted the territory of the Ulster government to be reduced to six counties, so that it would have a larger Protestant unionist majority. They feared that the territory would not last if it included too many Catholics and Irish nationalists. The six counties of Antrim, Down, Armagh, Londonderry, Tyrone and Fermanagh comprised the maximum area unionists believed they could dominate. Events overtook the government. In the 1918 Irish general election, the pro-independence Sinn Féin party won the overwhelming majority of Irish seats. Sinn Féin's elected members boycotted the British parliament and founded a separate Irish parliament (Dáil Éireann), declaring an independent Irish Republic covering the whole island. Many Irish republicans blamed the British establishment for the sectarian divisions in Ireland, and believed that Ulster Unionist defiance would fade once British rule was ended. The British authorities outlawed the Dáil in September 1919, and a guerrilla conflict developed as the Irish Republican Army (IRA) began attacking British forces. This became known as the Irish War of Independence.Gibney, John (editor). The Irish War of Independence and Civil War. Pen and Sword History, 2020. pp.xii–xiii Meanwhile, the Government of Ireland Act 1920 passed through the British parliament in 1920. It would divide Ireland into two self-governing UK territories: the six northeastern counties (Northern Ireland) being ruled from Belfast, and the other twenty-six counties (Southern Ireland) being ruled from Dublin. 
Both would have a shared Lord Lieutenant of Ireland, who would appoint both governments, and a Council of Ireland, which the British government intended to evolve into an all-Ireland parliament. The Act received royal assent that December, becoming the Government of Ireland Act 1920. It came into force on 3 May 1921 (Jackson, Alvin. Home Rule – An Irish History. Oxford University Press, 2004, pp. 368–370), partitioning Ireland and creating Northern Ireland. Elections to the Northern parliament were held on 24 May, in which Unionists won most seats. Its parliament first met on 7 June and formed its first devolved government, headed by Unionist Party leader James Craig. Republican and nationalist members refused to attend. King George V addressed the ceremonial opening of the Northern parliament on 22 June. During 1920–22, in what became Northern Ireland, partition was accompanied by violence "in defence or opposition to the new settlement". The IRA carried out attacks on British forces in the north-east, but was less active than in the south of Ireland. Protestant loyalists attacked the Catholic community in reprisal for IRA actions. In summer 1920, sectarian violence erupted in Belfast and Derry, and there were mass burnings of Catholic property in Lisburn and Banbridge. Conflict continued intermittently for two years, mostly in Belfast, which saw "savage and unprecedented" communal violence between Protestant and Catholic civilians. There were riots, gun battles and bombings. Homes, businesses and churches were attacked, and people were expelled from workplaces and from mixed neighbourhoods. More than 500 were killed and more than 10,000 became refugees, most of them Catholics. The British Army was deployed and the Ulster Special Constabulary (USC) was formed to help the regular police. The USC was almost wholly Protestant, and some of its members carried out reprisal attacks on Catholics.
A truce between British forces and the IRA was established on 11 July 1921, ending the fighting in most of Ireland. However, communal violence continued in Belfast, and in 1922 the IRA launched a guerrilla offensive in border areas of Northern Ireland. The Anglo-Irish Treaty was signed between representatives of the British Government and the Irish Republic on 6 December 1921. This created the Irish Free State. Under the terms of the treaty, Northern Ireland would become part of the Free State unless the government opted out by presenting an address to the king, although in practice partition remained in place. As expected, the Parliament of Northern Ireland resolved on 7 December 1922 (the day after the establishment of the Irish Free State) to exercise its right to opt out of the Free State by making an address to King George V. The text of the address was: Shortly afterwards, the Irish Boundary Commission was established to decide on the border between the Irish Free State and Northern Ireland. Owing to the outbreak of the Irish Civil War, the work of the commission was delayed until 1925. The Free State government and Irish nationalists hoped for a large transfer of territory to the Free State, as many border areas had nationalist majorities, leaving the remaining Northern Ireland too small to be viable. However, the commission's final report recommended only small transfers of territory, and in both directions. The Free State, Northern Ireland and UK governments agreed to suppress the report and accept the status quo, while the UK government agreed that the Free State would no longer have to pay its share of the UK national debt. 1925–1965 Northern Ireland's border was drawn to give it "a decisive Protestant majority". At the time of its creation, Northern Ireland's population was two-thirds Protestant and one-third Catholic. 
Most Protestants were unionists/loyalists who sought to maintain Northern Ireland as a part of the United Kingdom, while most Catholics were Irish nationalists/republicans who sought an independent united Ireland. There was mutual self-imposed segregation in Northern Ireland between Protestants and Catholics, such as in education, housing and often employment. For its first fifty years, Northern Ireland had an unbroken series of Unionist Party governments. Almost every minister in these governments was a member of the Protestant Orange Order. Almost all judges and magistrates were Protestant, many of them closely associated with the Unionist Party. Northern Ireland's new police force was the Royal Ulster Constabulary (RUC), which succeeded the Royal Irish Constabulary (RIC). It too was almost wholly Protestant and lacked operational independence, responding to directions from government ministers. The RUC and the reserve Ulster Special Constabulary (USC) were militarised police forces due to the threat from the IRA. They "had at their disposal the Special Powers Act, a sweeping piece of legislation which allowed arrests without warrant, internment without trial, unlimited search powers and bans on meetings and publications". The Nationalist Party was the main political party in opposition to the Unionist governments. However, its elected members often protested by abstaining from the Northern Ireland parliament, and many nationalists did not vote in parliamentary elections. Other early nationalist groups which campaigned against partition included the National League of the North (formed in 1928), the Northern Council for Unity (formed in 1937) and the Irish Anti-Partition League (formed in 1945).
The Unionist governments, and some unionist-dominated local authorities, were accused of discriminating against the Catholic and Irish nationalist minority, especially over the gerrymandering of electoral boundaries, the allocation of public housing, public sector employment, and policing. While some individual accusations were unfounded or exaggerated, there were enough proven cases to show "a consistent and irrefutable pattern of deliberate discrimination against Catholics". In June 1940, to encourage the neutral Irish state to join the Allies of World War II, British Prime Minister Winston Churchill indicated to the Taoiseach, Éamon de Valera, that the United Kingdom would push for Irish unity, but, believing that Churchill could not deliver, de Valera declined the offer. The British did not inform the Government of Northern Ireland that they had made the offer to the Dublin government, and de Valera's rejection was not publicised until 1970. The Ireland Act 1949 gave the first legal guarantee that the region would not cease to be part of the United Kingdom without the consent of the Parliament of Northern Ireland. From 1956 to 1962, the Irish Republican Army (IRA) carried out a limited guerrilla campaign in border areas of Northern Ireland, called the Border Campaign. It aimed to destabilise Northern Ireland and bring about an end to partition, but ended in failure. In 1965, Northern Ireland's Prime Minister Terence O'Neill met the Taoiseach, Seán Lemass. It was the first meeting between the two heads of government since partition. The Troubles The Troubles, which started in the late 1960s, consisted of about 30 years of recurring acts of intense violence during which 3,254 people were killed, with over 50,000 casualties. From 1969 to 2003 there were over 36,900 shooting incidents and over 16,200 bombings or attempted bombings associated with the Troubles.
The conflict was caused by the disputed status of Northern Ireland within the United Kingdom and the discrimination against the Irish nationalist minority by the dominant unionist majority. From 1967 to 1972, the Northern Ireland Civil Rights Association (NICRA), which modelled itself on the US civil rights movement, led a campaign of civil resistance to anti-Catholic discrimination in housing, employment, policing, and electoral procedures. The franchise for local government elections included only rate-payers and their spouses, and so excluded over a quarter of the electorate. While the majority of disenfranchised electors were Protestant, Catholics were over-represented since they were poorer and had more adults still living in the family home. NICRA's campaign, seen by many unionists as an Irish republican front, and the violent reaction to it proved to be a precursor to a more violent period. As early as 1969, armed campaigns of paramilitary groups began, including the Provisional IRA campaign of 1969–1997, which aimed to end British rule in Northern Ireland and create a united Ireland, and the Ulster Volunteer Force, formed in 1966 in response to the perceived erosion of both the British character and the unionist domination of Northern Ireland. The state security forces – the British Army and the police (the Royal Ulster Constabulary) – were also involved in the violence. The British government's position is that its forces were neutral in the conflict, trying to uphold law and order in Northern Ireland and the right of the people of Northern Ireland to democratic self-determination. Republicans regarded the state forces as combatants in the conflict, pointing to collusion between the state forces and loyalist paramilitaries as proof of this.
The "Ballast" investigation by the Police Ombudsman confirmed that British forces, and in particular the RUC, did collude with loyalist paramilitaries, were involved in murder, and did obstruct the course of justice when such claims had been investigated, although the extent to which such collusion occurred is still disputed. As a consequence of the worsening security situation, autonomous regional government for Northern Ireland was suspended in 1972. Alongside the violence, there was a political deadlock between the major political parties in Northern Ireland, including those who condemned violence, over the future status of Northern Ireland and the form of government it should have. In 1973, Northern Ireland held a referendum to determine if it should remain in the United Kingdom or become part of a united Ireland. The vote went heavily in favour (98.9%) of maintaining the status quo: approximately 57.5% of the total electorate voted in support, but only 1% of Catholics voted, following a boycott organised by the Social Democratic and Labour Party (SDLP). Peace process The Troubles were brought to an uneasy end by a peace process which included the declaration of ceasefires by most paramilitary organisations and the complete decommissioning of their weapons, the reform of the police, and the corresponding withdrawal of army troops from the streets and from sensitive border areas such as South Armagh and Fermanagh, as agreed by the signatories to the Belfast Agreement (commonly known as the "Good Friday Agreement"). This reiterated the long-held British position, which had never before been fully acknowledged by successive Irish governments, that Northern Ireland will remain within the United Kingdom until a majority of voters in Northern Ireland decides otherwise. The Constitution of Ireland was amended in 1999 to remove a claim of the "Irish nation" to sovereignty over the entire island (in Article 2).
The new Articles 2 and 3, added to the Constitution to replace the earlier articles, implicitly acknowledge that the status of Northern Ireland, and its relationships within the rest of the United Kingdom and with the Republic of Ireland, would only be changed with the agreement of a majority of voters in each jurisdiction. This aspect was also central to the Belfast Agreement, which was signed in 1998 and ratified by referendums held simultaneously in both Northern Ireland and the Republic. At the same time, the British Government recognised for the first time, as part of the prospective agreement, the so-called "Irish dimension": the principle that the people of the island of Ireland as a whole have the right, without any outside interference, to solve the issues between North and South by mutual consent. The latter statement was key to winning support for the agreement from nationalists. The agreement established a devolved power-sharing government within Northern Ireland, which must consist of both unionist and nationalist parties. These institutions were suspended by the British Government in 2002 after Police Service of Northern Ireland (PSNI) allegations of spying by people working for Sinn Féin at the Assembly (Stormontgate). The resulting case against the accused Sinn Féin member collapsed. On 28 July 2005, the Provisional IRA declared an end to its campaign and has since decommissioned what is thought to be all of its arsenal. This final act of decommissioning was performed under the watch of the Independent International Commission on Decommissioning (IICD) and two external church witnesses. Many unionists, however, remained sceptical. The IICD later confirmed that the main loyalist paramilitary groups, the Ulster Defence Association, the UVF and the Red Hand Commando, had decommissioned what is thought to be all of their arsenals, witnessed by former archbishop Robin Eames and a former top civil servant.
Politicians elected to the Assembly at the 2003 Assembly election were called together on 15 May 2006 under the Northern Ireland Act 2006 for the purpose of electing a First Minister and deputy First Minister of Northern Ireland and choosing the members of an Executive (before 25 November 2006) as a preliminary step to the restoration of devolved government. Following the election held on 7 March 2007, devolved government returned on 8 May 2007 with Democratic Unionist Party (DUP) leader Ian Paisley and Sinn Féin deputy leader Martin McGuinness taking office as First Minister and deputy First Minister, respectively. In its white paper on Brexit the United Kingdom government reiterated its commitment to the Belfast Agreement. With regard to Northern Ireland's status, it said that the UK Government's "clearly-stated preference is to retain Northern Ireland’s current constitutional position: as part of the UK, but with strong links to Ireland". Politics Background The main political divide in Northern Ireland is between unionists, who wish to see Northern Ireland continue as part of the United Kingdom, and nationalists, who wish to see Northern Ireland unified with the Republic of Ireland, independent from the United Kingdom. These two opposing views are linked to deeper cultural divisions. Unionists are predominantly Ulster Protestant, descendants of mainly Scottish, English, and Huguenot settlers as well as Gaels who converted to one of the Protestant denominations. Nationalists are overwhelmingly Catholic and descend from the population predating the settlement, with a minority from the Scottish Highlands as well as some converts from Protestantism. Discrimination against nationalists under the Stormont government (1921–1972) gave rise to the civil rights movement in the 1960s. 
While some unionists argue that discrimination was not just due to religious or political bigotry, but also the result of more complex socio-economic, socio-political and geographical factors, its existence, and the manner in which nationalist anger at it was handled, were major contributing factors to the Troubles. The political unrest went through its most violent phase between 1968 and 1994. In 2007, 36% of the population defined themselves as unionist, 24% as nationalist and 40% as neither. According to a 2015 opinion poll, 70% express a long-term preference for the maintenance of Northern Ireland's membership of the United Kingdom (either directly ruled or with devolved government), while 14% express a preference for membership of a united Ireland. This discrepancy can be explained by the overwhelming preference among Protestants to remain a part of the UK (93%), while Catholic preferences are spread across a number of solutions to the constitutional question, including remaining a part of the UK (47%), a united Ireland (32%), Northern Ireland becoming an independent state (4%), and those who "don't know" (16%). Official voting figures, which reflect views on the "national question" along with issues of candidate, geography, personal loyalty and historic voting patterns, show 54% of Northern Ireland voters vote for unionist parties, 42% for nationalist parties and 4% "other". Opinion polls consistently show that election results are not necessarily an indication of the electorate's stance on the constitutional status of Northern Ireland. Most of the population of Northern Ireland is at least nominally Christian, belonging mostly to Roman Catholic and Protestant denominations. Many voters (regardless of religious affiliation) are attracted to unionism's conservative policies, while other voters are instead attracted to the traditionally leftist Sinn Féin and SDLP and their respective party platforms for democratic socialism
prosecuted its members as collaborators. Nearly 50,000 were brought to trial, approximately half of whom received prison sentences. The authorities executed Quisling for treason, as well as a few other high-profile NS members and prominent German officials in Norway for war crimes. The sentences' lawfulness has been questioned, however, as Norway did not have capital punishment in peacetime, and the Norwegian constitution at the time stipulated that capital punishment for war crimes had to be carried out during actual wartime. Another issue of post-war treatment has been the ongoing Hamsun debate in Norway. The internationally renowned author Knut Hamsun, although never a member, was a well-known NS sympathiser. After the war, however, Hamsun was deemed mentally unfit to stand trial, and the issue of his links to the party has never been properly resolved. Hamsun's status as a Nobel Prize laureate and probably the best-known Norwegian author after Henrik Ibsen also makes his ties to NS a touchy subject, as many feel the valuation of Hamsun's literature should not be marred by constant debate about whether or not he was a fascist. Uniforms and insignia Parliamentary elections References Further reading Larsen, Stein Ugelvik. "Charisma from Below? The Quisling Case in Norway." Totalitarian Movements and Political Religions 7#2 (2006): 235–244. Larsen, Stein Ugelvik. "The Social Foundations of Norwegian Fascism 1933–1945: An Analysis of Membership Data", in Stein Ugelvik Larsen, Bernt Hagtvet, and Jan Petter Myklebust, eds. Who Were the Fascists: Social Roots of European Fascism (Columbia University Press, 1980). Hamre, Martin Kristoffer. "Norwegian Fascism in a Transnational Perspective: The Influence of German National Socialism and Italian Fascism on the Nasjonal Samling, | painted on his shield.
During the German occupation When Germany invaded Norway in April 1940, Quisling marched into the Norwegian Broadcasting Corporation studios in Oslo and made a radio broadcast proclaiming himself Prime Minister and ordering all anti-German resistance to end immediately. However, King Haakon VII, in unoccupied territory along with the Nygaardsvold government, let it be known that he would abdicate rather than appoint any government headed by Quisling. The Nygaardsvold government refused to step down in Quisling's favour or serve under him, and confirmed that resistance was to be continued. With no popular support, the German forces of occupation quickly thrust Quisling aside. Party membership rose to 22,000 by December 1940 and peaked at 43,400 in November 1943. After a brief period with a civilian caretaker government (Administrasjonsrådet) appointed by the Supreme Court, the Germans took control through Reichskommissar Josef Terboven. He appointed a government responsible to himself, with most ministers drawn from the ranks of Nasjonal Samling. However, the party leader, Quisling, was controversial in Norway as well as among the occupiers, and was denied a formal position until 1 February 1942, when he became "minister president" of the "national government". Other important ministers were Jonas Lie (also head of the Norwegian wing of the SS from 1941) as minister of police, Gulbrand Lunde as minister of "popular enlightenment and propaganda", and the opera singer Albert Viljam Hagelin, who was Minister of Home Affairs. The NS administration had a certain amount of autonomy in purely civilian matters, but in reality it was controlled by the Reichskommissar as "head of state", subordinate only to Adolf Hitler. Post-war The post-war authorities proscribed the party and
Whom do you believe?

A neighbour came to the gate of Hodja Nasreddin's yard. The Hodja went to meet him outside. "Would you mind, Hodja," the neighbour asked, "lending me your donkey today? I have some goods to transport to the next town." The Hodja didn't feel inclined to lend the animal to that particular man, however. So, not to seem rude, he answered: "I'm sorry, but I've already lent him to somebody else." All of a sudden the donkey could be heard braying loudly behind the wall of the yard. "But Hodja," the neighbour exclaimed, "I can hear it behind that wall!" "Whom do you believe," the Hodja replied indignantly, "the donkey or your Hodja?"

Taste the same

Some children saw Nasreddin coming from the vineyard with two baskets full of grapes loaded on his donkey. They gathered around him and asked him to give them a taste. Nasreddin picked up a bunch of grapes and gave each child a grape. "You have so much, but you gave us so little," the children whined. "There is no difference whether you have a basketful or a small piece. They all taste the same," Nasreddin answered, and continued on his way.

Nasreddin's ring

Mulla had lost his ring in the living room. He searched for it for a while, but since he could not find it, he went out into the yard and began to look there. His wife, who saw what he was doing, asked: "Mulla, you lost your ring in the room, why are you looking for it in the yard?" Mulla stroked his beard and said: "The room is too dark and I can't see very well. I came out to the courtyard to look for my ring because there is much more light out here."

In Asian and Caucasus folk tradition and literature

For Uzbek people, Nasreddin is one of their own; he is said to have lived and been born in Bukhara.
In gatherings, family meetings, and parties they tell each other stories about him that are called "latifa" or "afandi". There are at least two collections of Uzbek stories related to Nasriddin Afandi:

"Afandining qirq bir passhasi" (Forty-one Flies of Afandi) – Zohir A'lam, Tashkent
"Afandining besh xotini" (Five Wives of Afandi)

Nasreddin was the main character in a magazine, called simply Molla Nasraddin, published in Azerbaijan and "read across the Muslim world from Morocco to Iran". The eight-page Azerbaijani satirical periodical was published in Tiflis (from 1906 to 1917), Tabriz (in 1921) and Baku (from 1922 to 1931) in the Azeri and occasionally Russian languages. Founded by Jalil Mammadguluzadeh, it depicted inequality, cultural assimilation, and corruption, and ridiculed the backward lifestyles and values of clergy and religious fanatics. The magazine was frequently banned but had a lasting influence on Azerbaijani and Iranian literature. He is known as Mullah Nasruddin in South Asian children's books. A TV serial about him was aired in India as Mulla Nasiruddin and was widely watched in India and Pakistan.

In European and Western folk tradition and literature

Some Nasreddin tales also appear in collections of Aesop's fables. The Miller, His Son and the Donkey is one example. Others are "The Ass with a Burden of Salt" (Perry Index 180) and "The Satyr and the Traveller". In some Bulgarian folk tales that originated during the Ottoman period, the name appears as an antagonist to a local wise man, named Sly Peter. In Sicily the same tales involve a man named Giufà. In Sephardic culture, spread throughout the Ottoman Empire, a character that appears in many folk tales is named Djohá. In Romanian, the existing stories come from an 1853 verse compilation edited by Anton Pann, a philologist and poet renowned for authoring the current Romanian anthem.
Nasreddin is mostly known as a character from short tales; however, he has also been featured in longer forms, such as novels and films. In Russia, Nasreddin is known mostly through the Russian work Возмутитель спокойствия by Leonid Solovyov (English translations: "The Beggar in the Harem: Impudent Adventures in Old Bukhara", 1956, and "The Tale of Hodja Nasreddin: Disturber of the Peace", 2009). The composer Shostakovich celebrated Nasreddin, among other figures, in the second movement (Yumor, "Humor") of his Symphony No. 13. The text, by Yevgeny Yevtushenko, portrays humor as a weapon against dictatorship and tyranny. Shostakovich's music shares many of the "foolish yet profound" qualities of Nasreddin's sayings listed above. The Graeco-Armenian mystic G. I. Gurdjieff often referred to "our own dear Mullah Nasr Eddin", also calling him an "incomparable teacher", particularly in his book Beelzebub's Tales. The Sufi philosopher Idries Shah published several collections of Nasruddin stories in English and emphasized their teaching value.

Film

In 1943, the Soviet film Nasreddin in Bukhara was directed by Yakov Protazanov based on Solovyov's book, followed in 1947 by a film called The Adventures of Nasreddin, directed by Nabi Ganiyev and also set in the Uzbek SSR. In 1964, Richard Williams, a Canadian-British animator, began work on Nasrudin, an animated film based on the character. The film was produced with the help of Idries Shah, for whom Williams had illustrated books about the character; however, tensions between Williams' crew and the Shah family led Williams to end his relationship with them, causing him to lose the right to use Nasreddin as a character. The unfinished film was later reworked into The Thief and the Cobbler, which had a similarly troubled production history.
Collections

Bacha, Mohamed. 30 Funny Stories of Joha, The Beloved Folk Hero of The East (bilingual English - Arabic)
600 Mulla Nasreddin Tales, collected by Mohammad Ramazani (Popular Persian Text Series: 1) (in Persian)
Tales of the Hodja, retold by Charles Downing, illustrated by William Papas. Oxford University Press: London, 1964
The Exploits of the Incomparable Mulla Nasreddin, by Idries Shah, illustrated by Richard Williams
The Subtleties of the Inimitable Mulla Nasreddin, by Idries Shah, illustrated by Richard Williams
The Pleasantries of the Incredible Mulla Nasrudin, by Idries Shah, illustrated by Richard Williams and Errol Le Cain
Mullah Nasiruddiner Galpo (Tales of Mullah Nasreddin), collected and retold by Satyajit Ray (in Bengali)
The Wisdom of Mulla Nasruddin, by Shahrukh Husain
Watermelons, Walnuts, and the Wisdom of Allah and Other Tales of the Hoca, by Barbara K. Walker, illustrated by Harold Berson
The Uncommon Sense of the Immortal Mullah Nasruddin: Stories, jests, and donkey

They purvey a pithy folk wisdom that triumphs over all trials and tribulations. The oldest manuscript of Nasreddin dates to 1571. Some of the stories, however, can be traced as far back as the Philogelos and Aesop's fables. Today, Nasreddin stories are told in a wide variety of regions, especially across the Muslim world, and have been translated into many languages. Some regions independently developed a character similar to Nasreddin, and the stories have become part of a larger whole. In many regions, Nasreddin is a major part of the culture and is quoted or alluded to frequently in daily life. Since there are thousands of different Nasreddin stories, one can be found to fit almost any occasion.
Nasreddin often appears as a whimsical character of a large Turkish, Persian, Albanian, Armenian, Azerbaijani, Bengali, Bosnian, Bulgarian, Chinese, Greek, Gujarati, Hindi, Judeo-Spanish, Kurdish, Romanian, Serbian, Russian, and Urdu folk tradition of vignettes, not entirely different from Zen koans. 1996–1997 was declared International Nasreddin Year by UNESCO.

Name

Many peoples of the Near East, Middle East, South Asia and Central Asia claim Nasreddin as their own (e.g., Turks, Afghans, Iranians, and Uzbeks). His name is spelt in a wide variety of ways: Nasrudeen, Nasrudin, Nasruddin, Nasr ud-Din, Nasredin, Nasiruddin, Naseeruddin, Nasr Eddin, Nastradhin, Nasreddine, Nastratin, Nusrettin, Nasrettin, Nostradin, Nastradin (lit.: Victory of the Deen) and Nazaruddin. It is sometimes preceded or followed by a title or honorific used in the corresponding cultures: "Hoxha", "Khwaje", "Hodja", "Hoja", "Hojja", "Hodscha", "Hodža", "Hoca", "Hocca", "Hooka", "Hogea", "Mullah", "Mulla", "Mula", "Molla", "Efendi", "Afandi", "Ependi" ('afandī), "Hajji". In several cultures he is named by the title alone. In Arabic-speaking countries this character is known as "Juha", "Djoha", "Djuha", "Dschuha", "Chotzas", "Goha" (juḥā). Juha was originally a separate folk character found in Arabic literature as early as the 9th century, and was widely popular by the 11th century. Lore of the two characters became amalgamated in the 19th century when collections were translated from Arabic into Turkish and Persian. In Sicily and Southern Italy he is known as "Giufà", derived from the Arabic character Juha. In Swahili and Indonesian culture, many of his stories are told under the name of "Abunuwasi" or "Abunawas", though this confuses Nasreddin with an entirely different man – the poet Abu Nuwas, known for homoerotic verse. In China, where stories of him are well known, he is known by various transliterations of his Uyghur name, 阿凡提 (Āfántí) and 阿方提 (Āfāngtí).
The Uyghurs believe that he was from Xinjiang, while the Uzbeks believe he was from Bukhara. Shanghai Animation Film Studio produced a 13-episode Nasreddin-related animation called The Story of Afanti (阿凡提) in 1979, which became one of the most influential animations in China's history. The musical Nasirdin Apandim features the legend of Nasreddin effendi ("sir, lord"), largely sourced from Uyghur folklore. In Central Asia, he is commonly known as "Afandi". The Central Asian peoples also claim his local origin, as do Uyghurs.

Tales

The Nasreddin stories are known throughout the Middle East and have touched cultures around the world. Superficially, most of the Nasreddin stories may be told as jokes or humorous anecdotes. They are told and retold endlessly in the teahouses and caravanserais of Asia and can be heard in homes and on the radio. But it is inherent in a Nasreddin story that it may be understood at many levels. There is the joke, followed by a moral and usually the little extra which brings the consciousness of the potential mystic a little further on the way to realization.

Examples

The Sermon

Once Nasreddin was invited to deliver a sermon. When he got on the pulpit, he asked, "Do you know what I am going to say?" The audience replied "no", so he announced, "I have no desire to speak to people who don't even know what I will be talking about!" and left. The people felt embarrassed and called him back again the next day. This time, when he asked the same question, the people replied "yes". So Nasreddin said, "Well, since you already know what I am going to say, I won't waste any more of your time!" and left. Now the people were really perplexed. They decided to try one more time and once again invited the Mulla to speak the following week. Once again he asked the same question: "Do you know what I am going to say?" Now the people were prepared, and so half of them answered "yes" while the other half replied "no".
So Nasreddin said, "Let the half who know what I am going to say tell it to the half who don't," and left.
Neutrons are subject to the Pauli exclusion principle; two neutrons cannot have the same quantum numbers. This is the source of the degeneracy pressure which makes neutron stars possible.

Structure and geometry of charge distribution

An article published in 2007 featuring a model-independent analysis concluded that the neutron has a negatively charged exterior, a positively charged middle, and a negative core. In a simplified classical view, the negative "skin" of the neutron helps attract it to the protons with which it interacts in the nucleus; but the main attraction between neutrons and protons is via the nuclear force, which does not involve electric charge. The simplified classical view of the neutron's charge distribution also "explains" the fact that the neutron's magnetic dipole points in the opposite direction from its spin angular momentum vector (as compared to the proton). This gives the neutron, in effect, a magnetic moment which resembles that of a negatively charged particle. This can be reconciled classically with a neutral neutron composed of a charge distribution in which the negative sub-parts of the neutron have a larger average radius of distribution, and therefore contribute more to the particle's magnetic dipole moment, than do the positive parts, which are, on average, nearer the core.

Electric dipole moment

The Standard Model of particle physics predicts a tiny separation of positive and negative charge within the neutron, leading to a permanent electric dipole moment. But the predicted value is well below the current sensitivity of experiments. From several unsolved puzzles in particle physics, it is clear that the Standard Model is not the final and full description of all particles and their interactions. New theories going beyond the Standard Model generally lead to much larger predictions for the electric dipole moment of the neutron.
Currently, at least four experiments are trying to measure for the first time a finite neutron electric dipole moment, including:

Cryogenic neutron EDM experiment being set up at the Institut Laue–Langevin
nEDM experiment under construction at the new UCN source at the Paul Scherrer Institute
nEDM experiment being envisaged at the Spallation Neutron Source
nEDM experiment being built at the Institut Laue–Langevin

Antineutron

The antineutron is the antiparticle of the neutron. It was discovered by Bruce Cork in 1956, a year after the antiproton was discovered. CPT-symmetry puts strong constraints on the relative properties of particles and antiparticles, so studying antineutrons provides stringent tests of CPT-symmetry. The fractional difference in the masses of the neutron and antineutron is small. Since the difference is only about two standard deviations away from zero, it does not give any convincing evidence of CPT-violation.

Neutron compounds

Dineutrons and tetraneutrons

The existence of stable clusters of four neutrons, or tetraneutrons, has been hypothesised by a team led by Francisco-Miguel Marqués at the CNRS Laboratory for Nuclear Physics, based on observations of the disintegration of beryllium-14 nuclei. This is particularly interesting because current theory suggests that these clusters should not be stable. In February 2016, Japanese physicist Susumu Shimoura of the University of Tokyo and co-workers reported that they had observed the purported tetraneutrons for the first time experimentally. Nuclear physicists around the world say this discovery, if confirmed, would be a milestone in the field of nuclear physics and would certainly deepen our understanding of the nuclear forces. The dineutron is another hypothetical particle. In 2012, Artemis Spyrou from Michigan State University and coworkers reported that they observed, for the first time, dineutron emission in the decay of 16Be.
The dineutron character is evidenced by a small emission angle between the two neutrons. The authors measured the two-neutron separation energy to be 1.35(10) MeV, in good agreement with shell-model calculations using standard interactions for this mass region.

Neutronium and neutron stars

At extremely high pressures and temperatures, nucleons and electrons are believed to collapse into bulk neutronic matter, called neutronium. This is presumed to happen in neutron stars. The extreme pressure inside a neutron star may deform the neutrons into a cubic symmetry, allowing tighter packing of neutrons.

Detection

The common means of detecting a charged particle, looking for a track of ionization (such as in a cloud chamber), does not work for neutrons directly. Neutrons that elastically scatter off atoms can create an ionization track that is detectable, but the experiments are not as simple to carry out; other means for detecting neutrons, which let them interact with atomic nuclei, are more commonly used. The commonly used methods to detect neutrons can therefore be categorized according to the nuclear processes relied upon, mainly neutron capture or elastic scattering.

Neutron detection by neutron capture

A common method for detecting neutrons involves converting the energy released from neutron capture reactions into electrical signals. Certain nuclides have a high neutron capture cross section, which is the probability of absorbing a neutron. Upon neutron capture, the compound nucleus emits more easily detectable radiation, for example an alpha particle, which is then detected. A number of nuclides with suitably large capture cross sections are used for this purpose.

Neutron detection by elastic scattering

Neutrons can elastically scatter off nuclei, causing the struck nucleus to recoil. Kinematically, a neutron can transfer more energy to a light nucleus such as hydrogen or helium than to a heavier nucleus.
Detectors relying on elastic scattering are called fast neutron detectors. Recoiling nuclei can ionize and excite further atoms through collisions. Charge and/or scintillation light produced in this way can be collected to produce a detected signal. A major challenge in fast neutron detection is discerning such signals from erroneous signals produced by gamma radiation in the same detector. Methods such as pulse shape discrimination can be used to distinguish neutron signals from gamma-ray signals, although certain inorganic scintillator-based detectors have been developed to selectively detect neutrons in mixed radiation fields inherently, without any additional techniques. Fast neutron detectors have the advantage of not requiring a moderator, and are therefore capable of measuring the neutron's energy, time of arrival, and in certain cases direction of incidence.

Sources and production

Free neutrons are unstable, although they have the longest half-life of any unstable subatomic particle by several orders of magnitude. Their half-life is still only about 10 minutes, so they can be obtained only from sources that produce them continuously.

Natural neutron background. A small natural background flux of free neutrons exists everywhere on Earth. In the atmosphere and deep into the ocean, the "neutron background" is caused by muons produced by cosmic ray interaction with the atmosphere. These high-energy muons are capable of penetrating to considerable depths in water and soil. There, in striking atomic nuclei, among other reactions they induce spallation reactions in which a neutron is liberated from the nucleus. Within the Earth's crust a second source is neutrons produced primarily by spontaneous fission of uranium and thorium present in crustal minerals.
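The roughly ten-minute half-life quoted above determines how quickly a free-neutron population disappears. A minimal sketch of the exponential decay law, assuming the commonly cited measured mean lifetime of about 880 s (which corresponds to a half-life of about 610 s):

```python
import math

# Free-neutron decay: N(t) = N0 * exp(-t / tau).
# TAU ~ 880 s is the approximate measured mean lifetime;
# the half-life is TAU * ln(2) ~ 610 s, i.e. "about 10 minutes".
TAU = 880.0  # mean lifetime in seconds (approximate)

def surviving_fraction(t_seconds: float, tau: float = TAU) -> float:
    """Fraction of an initial free-neutron population still undecayed after t seconds."""
    return math.exp(-t_seconds / tau)

half_life = TAU * math.log(2)  # ~610 s
print(f"half-life ~ {half_life:.0f} s")
print(f"after 1 minute: {surviving_fraction(60):.3f}")   # most neutrons still present
print(f"after 1 hour:  {surviving_fraction(3600):.4f}")  # almost all have decayed
```

This is why free neutrons must come from sources that produce them continuously: after an hour, under two percent of any given population remains.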
The neutron background is not strong enough to be a biological hazard, but it is of importance to very high resolution particle detectors that look for very rare events, such as (hypothesized) interactions that might be caused by particles of dark matter. Recent research has shown that even thunderstorms can produce neutrons with energies of up to several tens of MeV, and that the fluence of these neutrons lies between 10⁻⁹ and 10⁻¹³ per ms and per m², depending on the detection altitude. The energy of most of these neutrons, even with initial energies of 20 MeV, decreases down to the keV range within 1 ms. Even stronger neutron background radiation is produced at the surface of Mars, where the atmosphere is thick enough to generate neutrons from cosmic ray muon production and neutron spallation, but not thick enough to provide significant protection from the neutrons produced. These neutrons not only produce a Martian surface neutron radiation hazard from direct downward-going neutron radiation, but may also produce a significant hazard from reflection of neutrons off the Martian surface, which produces reflected neutron radiation penetrating upward into a Martian craft or habitat from the floor.

Sources of neutrons for research. These include certain types of radioactive decay (spontaneous fission and neutron emission) and certain nuclear reactions. Convenient nuclear reactions include tabletop reactions such as natural alpha and gamma bombardment of certain nuclides, often beryllium or deuterium, and induced nuclear fission, such as occurs in nuclear reactors. In addition, high-energy nuclear reactions (such as occur in cosmic radiation showers or accelerator collisions) also produce neutrons from the disintegration of target nuclei. Small (tabletop) particle accelerators optimized to produce free neutrons in this way are called neutron generators.
In practice, the most commonly used small laboratory sources of neutrons use radioactive decay to power neutron production. One noted neutron-producing radioisotope, californium-252, decays (half-life 2.65 years) by spontaneous fission 3% of the time, producing 3.7 neutrons per fission, and is used alone as a neutron source from this process. Nuclear reaction sources (involving two materials) powered by radioisotopes use either an alpha decay source plus a beryllium target, or a source of high-energy gamma radiation from a source that undergoes beta decay followed by gamma decay; the high-energy gamma ray produces photoneutrons on interaction with ordinary stable beryllium, or with the deuterium in heavy water. A popular source of the latter type is radioactive antimony-124 plus beryllium, a system with a half-life of 60.9 days, which can be constructed from natural antimony (which is 42.8% stable antimony-123) by activating it with neutrons in a nuclear reactor and then transporting it to where the neutron source is needed. Nuclear fission reactors naturally produce free neutrons; their role is to sustain the energy-producing chain reaction. The intense neutron radiation can also be used to produce various radioisotopes through the process of neutron activation, which is a type of neutron capture. Experimental nuclear fusion reactors produce free neutrons as a waste product. But it is these neutrons that possess most of the energy, and converting that energy to a useful form has proved a difficult engineering challenge. Fusion reactors that generate neutrons are likely to create radioactive waste, but the waste is composed of neutron-activated lighter isotopes, which have relatively short (50–100 years) decay periods compared to typical half-lives of 10,000 years for fission waste, which is long primarily because of the long half-life of alpha-emitting transuranic actinides.
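The figures quoted above for californium-252 (2.65-year half-life, ~3% spontaneous-fission branch, ~3.7 neutrons per fission) are enough to estimate its neutron yield. A back-of-the-envelope sketch, using only those quoted values plus Avogadro's number:

```python
import math

# Rough neutron yield of a Cf-252 source from the figures in the text:
# half-life ~2.65 y, spontaneous-fission branch ~3%, ~3.7 neutrons per fission.
AVOGADRO = 6.022e23
YEAR_S = 3.156e7              # seconds per year
HALF_LIFE_S = 2.65 * YEAR_S
SF_BRANCH = 0.03              # fraction of decays that are spontaneous fissions
NEUTRONS_PER_SF = 3.7

def cf252_neutron_rate(mass_grams: float) -> float:
    """Approximate neutrons emitted per second by a Cf-252 sample of given mass."""
    atoms = mass_grams / 252.0 * AVOGADRO
    decay_const = math.log(2) / HALF_LIFE_S  # decays per atom per second
    return atoms * decay_const * SF_BRANCH * NEUTRONS_PER_SF

# One microgram already gives a usable laboratory flux, on the order of 2e6 n/s:
print(f"{cf252_neutron_rate(1e-6):.2e} neutrons/s per microgram")
```

This order-of-magnitude result (a few million neutrons per second per microgram) is why such tiny Cf-252 sources are practical for laboratory work.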
Some proposed nuclear fusion-fission hybrids would make use of those neutrons either to maintain a subcritical reactor or to aid in the nuclear transmutation of harmful long-lived nuclear waste into shorter-lived or stable nuclides.

Neutron beams and modification of beams after production

Free neutron beams are obtained from neutron sources by neutron transport. For access to intense neutron sources, researchers must go to a specialized neutron facility that operates a research reactor or a spallation source. The neutron's lack of net electric charge makes neutrons difficult to steer or accelerate. Charged particles can be accelerated, decelerated, or deflected by electric or magnetic fields; these methods have little effect on neutrons, although some effects may be attained by use of inhomogeneous magnetic fields because of the neutron's magnetic moment. Neutrons can be controlled by methods that include moderation, reflection, and velocity selection. Thermal neutrons can be polarized by transmission through magnetic materials in a method analogous to the Faraday effect for photons. Cold neutrons of wavelengths of 6–7 angstroms can be produced in beams of a high degree of polarization by use of magnetic mirrors and magnetized interference filters.

Applications

The neutron plays an important role in many nuclear reactions. For example, neutron capture often results in neutron activation, inducing radioactivity. In particular, knowledge of neutrons and their behavior has been important in the development of nuclear reactors and nuclear weapons. The fissioning of elements like uranium-235 and plutonium-239 is caused by their absorption of neutrons. Cold, thermal, and hot neutron radiation is commonly employed in neutron scattering facilities, where the radiation is used in much the same way that X-rays are used for the analysis of condensed matter.
Neutrons are complementary to X-rays in terms of atomic contrasts by different scattering cross sections; sensitivity to magnetism; energy range for inelastic neutron spectroscopy; and deep penetration into matter. The development of "neutron lenses" based on total internal reflection within hollow glass capillary tubes, or on reflection from dimpled aluminum plates, has driven ongoing research into neutron microscopy and neutron/gamma-ray tomography. A major use of neutrons is to excite delayed and prompt gamma rays from elements in materials. This forms the basis of neutron activation analysis (NAA) and prompt gamma neutron activation analysis (PGNAA). NAA is most often used to analyze small samples of materials in a nuclear reactor, whilst PGNAA is most often used to analyze subterranean rocks around boreholes and industrial bulk materials on conveyor belts. Another use of neutron emitters is the detection of light nuclei, in particular the hydrogen found in water molecules. When a fast neutron collides with a light nucleus, it loses a large fraction of its energy. By measuring the rate at which slow neutrons return to the probe after reflecting off hydrogen nuclei, a neutron probe may determine the water content in soil.

Medical therapies

Because neutron radiation is both penetrating and ionizing, it can be exploited for medical treatments. However, neutron radiation can have the unfortunate side-effect of leaving the affected area radioactive; neutron tomography is therefore not a viable medical application. Fast neutron therapy uses high-energy neutrons, typically greater than 20 MeV, to treat cancer. Radiation therapy of cancers is based upon the biological response of cells to ionizing radiation. If radiation is delivered in small sessions to damage cancerous areas, normal tissue will have time to repair itself, while tumor cells often cannot.
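The soil-moisture probe described above works because hydrogen moderates fast neutrons exceptionally well: a neutron can lose up to all of its energy in a single head-on collision with a proton. A standard textbook estimate uses the mean logarithmic energy decrement per collision, ξ (exactly 1 for hydrogen), to count how few collisions are needed to slow a fission neutron to thermal energies. A sketch, assuming textbook values for ξ:

```python
import math

# Average number of elastic collisions to slow a neutron from E0 down to E_thermal,
# using n = ln(E0 / E_thermal) / xi, where xi is the mean logarithmic
# energy decrement per collision (a textbook moderation estimate).
def collisions_to_thermalize(e0_ev: float, e_thermal_ev: float, xi: float) -> float:
    return math.log(e0_ev / e_thermal_ev) / xi

XI_HYDROGEN = 1.0    # exact for mass number A = 1
XI_CARBON = 0.158    # textbook value for A = 12

e0 = 2.0e6   # ~2 MeV, a typical fission-neutron energy (eV)
eth = 0.025  # thermal energy at room temperature (eV)

print(f"on hydrogen: ~{collisions_to_thermalize(e0, eth, XI_HYDROGEN):.0f} collisions")
print(f"on carbon:   ~{collisions_to_thermalize(e0, eth, XI_CARBON):.0f} collisions")
```

Roughly 18 collisions on hydrogen versus over a hundred on carbon: this is why hydrogen-rich materials such as water, polyethylene, and moist soil are such effective moderators, and why the returning slow-neutron rate tracks soil water content.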
New theories going beyond the Standard Model generally lead to much larger predictions for the electric dipole moment of the neutron. Currently, there are at least four experiments trying to measure for the first time a finite neutron electric dipole moment, including: Cryogenic neutron EDM experiment being set up at the Institut Laue–Langevin nEDM experiment under construction at the new UCN source at the Paul Scherrer Institute nEDM experiment being envisaged at the Spallation Neutron Source nEDM experiment being built at the Institut Laue–Langevin Antineutron The antineutron is the antiparticle of the neutron. It was discovered by Bruce Cork in 1956, a year after the antiproton was discovered. CPT-symmetry puts strong constraints on the relative properties of particles and antiparticles, so studying antineutrons provides stringent tests on CPT-symmetry. The fractional difference in the masses of the neutron and antineutron is . Since the difference is only about two standard deviations away from zero, this does not give any convincing evidence of CPT-violation. Neutron compounds Dineutrons and tetraneutrons The existence of stable clusters of 4 neutrons, or tetraneutrons, has been hypothesised by a team led by Francisco-Miguel Marqués at the CNRS Laboratory for Nuclear Physics based on observations of the disintegration of beryllium-14 nuclei. This is particularly interesting because current theory suggests that these clusters should not be stable. In February 2016, Japanese physicist Susumu Shimoura of the University of Tokyo and co-workers reported they had observed the purported tetraneutrons for the first time experimentally. Nuclear physicists around the world say this discovery, if confirmed, would be a milestone in the field of nuclear physics and certainly would deepen our understanding of the nuclear forces. The dineutron is another hypothetical particle. 
In 2012, Artemis Spyrou from Michigan State University and coworkers reported that they observed, for the first time, the dineutron emission in the decay of 16Be. The dineutron character is evidenced by a small emission angle between the two neutrons. The authors measured the two-neutron separation energy to be 1.35(10) MeV, in good agreement with shell model calculations, using standard interactions for this mass region. Neutronium and neutron stars At extremely high pressures and temperatures, nucleons and electrons are believed to collapse into bulk neutronic matter, called neutronium. This is presumed to happen in neutron stars. The extreme pressure inside a neutron star may deform the neutrons into a cubic symmetry, allowing tighter packing of neutrons. Detection The common means of detecting a charged particle by looking for a track of ionization (such as in a cloud chamber) does not work for neutrons directly. Neutrons that elastically scatter off atoms can create an ionization track that is detectable, but the experiments are not as simple to carry out; other means for detecting neutrons, consisting of allowing them to interact with atomic nuclei, are more commonly used. The commonly used methods to detect neutrons can therefore be categorized according to the nuclear processes relied upon, mainly neutron capture or elastic scattering. Neutron detection by neutron capture A common method for detecting neutrons involves converting the energy released from neutron capture reactions into electrical signals. Certain nuclides have a high neutron capture cross section, which is the probability of absorbing a neutron. Upon neutron capture, the compound nucleus emits more easily detectable radiation, for example an alpha particle, which is then detected. The nuclides , , , , , , and are useful for this purpose. Neutron detection by elastic scattering Neutrons can elastically scatter off nuclei, causing the struck nucleus to recoil. 
Kinematically, a neutron can transfer more energy to a light nucleus such as hydrogen or helium than to a heavier nucleus. Detectors relying on elastic scattering are called fast neutron detectors. Recoiling nuclei can ionize and excite further atoms through collisions. Charge and/or scintillation light produced in this way can be collected to produce a detected signal. A major challenge in fast neutron detection is discerning such signals from erroneous signals produced by gamma radiation in the same detector. Methods such as pulse shape discrimination can be used in distinguishing neutron signals from gamma-ray signals, although certain inorganic scintillator-based detectors have been developed to selectively detect neutrons in mixed radiation fields inherently without any additional techniques. Fast neutron detectors have the advantage of not requiring a moderator, and are therefore capable of measuring the neutron's energy, time of arrival, and in certain cases direction of incidence. Sources and production Free neutrons are unstable, although they have the longest half-life of any unstable subatomic particle by several orders of magnitude. Their half-life is still only about 10 minutes, so they can be obtained only from sources that produce them continuously. Natural neutron background. A small natural background flux of free neutrons exists everywhere on Earth. In the atmosphere and deep into the ocean, the "neutron background" is caused by muons produced by cosmic ray interaction with the atmosphere. These high-energy muons are capable of penetration to considerable depths in water and soil. There, in striking atomic nuclei, among other reactions they induce spallation reactions in which a neutron is liberated from the nucleus. Within the Earth's crust a second source is neutrons produced primarily by spontaneous fission of uranium and thorium present in crustal minerals. 
The neutron background is not strong enough to be a biological hazard, but it is of importance to very high resolution particle detectors that are looking for very rare events, such as (hypothesized) interactions that might be caused by particles of dark matter. Recent research has shown that even thunderstorms can produce neutrons with energies of up to several tens of MeV, with a fluence between 10⁻⁹ and 10⁻¹³ per ms and per m², depending on the detection altitude. The energy of most of these neutrons, even with initial energies of 20 MeV, decreases down to the keV range within 1 ms. Even stronger neutron background radiation is produced at the surface of Mars, where the atmosphere is thick enough to generate neutrons from cosmic ray muon production and neutron-spallation, but not thick enough to provide significant protection from the neutrons produced. These neutrons not only produce a Martian surface neutron radiation hazard from direct downward-going neutron radiation but may also produce a significant hazard from reflection of neutrons from the Martian surface, which will produce reflected neutron radiation penetrating upward into a Martian craft or habitat from the floor. Sources of neutrons for research. These include certain types of radioactive decay (spontaneous fission and neutron emission) and certain nuclear reactions. Convenient nuclear reactions include tabletop reactions such as natural alpha and gamma bombardment of certain nuclides, often beryllium or deuterium, and induced nuclear fission, such as occurs in nuclear reactors. In addition, high-energy nuclear reactions (such as occur in cosmic radiation showers or accelerator collisions) also produce neutrons from disintegration of target nuclei. Small (tabletop) particle accelerators optimized to produce free neutrons in this way are called neutron generators.
In practice, the most commonly used small laboratory sources of neutrons use radioactive decay to power neutron production. One noted neutron-producing radioisotope, californium-252, decays (half-life 2.65 years) by spontaneous fission 3% of the time, producing 3.7 neutrons per fission, and is used alone as a neutron source from this process. Nuclear reaction sources (that involve two materials) powered by radioisotopes use an alpha decay source plus a beryllium target, or else a source of high-energy gamma radiation from a source that undergoes beta decay followed by gamma decay, which produces photoneutrons on interaction of the high-energy gamma ray with ordinary stable beryllium, or else with the deuterium in heavy water. A popular source of the latter type is radioactive antimony-124 plus beryllium, a system with a half-life of 60.9 days, which can be constructed from natural antimony (which is 42.8% stable antimony-123) by activating it with neutrons in a nuclear reactor, then transporting it to where the neutron source is needed. Nuclear fission reactors naturally produce free neutrons; their role is to sustain the energy-producing chain reaction. The intense neutron radiation can also be used to produce various radioisotopes through the process of neutron activation, which is a type of neutron capture. Experimental nuclear fusion reactors produce free neutrons as a waste product. But it is these neutrons that possess most of the energy, and converting that energy to a useful form has proved a difficult engineering challenge. Fusion reactors that generate neutrons are likely to create radioactive waste, but the waste is composed of neutron-activated lighter isotopes, which have relatively short decay periods (50–100 years) compared with typical fission-waste half-lives of 10,000 years, the latter being long primarily because of the long half-life of alpha-emitting transuranic actinides.
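The claim that the fusion neutrons carry most of the energy can be checked with a short calculation. The D–T reaction below is an assumed example (the source does not name a specific reaction): in D + T → He-4 + n with Q = 17.6 MeV, momentum conservation gives each product a kinetic-energy share inversely proportional to its mass, so the light neutron takes about four fifths of the energy.

```python
# Illustrative check (D-T reaction is an assumption, not named in the text):
# two reaction products sharing energy Q split it inversely with mass,
# E1 = Q * m2 / (m1 + m2), E2 = Q * m1 / (m1 + m2).

Q_DT_MEV = 17.6   # total energy released in D + T -> He-4 + n, MeV
M_NEUTRON = 1.0   # product masses in neutron-mass units (approximate)
M_ALPHA = 4.0

def product_energies(q: float, m1: float, m2: float) -> tuple[float, float]:
    """Kinetic energies (same units as q) of two products of a two-body decay."""
    return q * m2 / (m1 + m2), q * m1 / (m1 + m2)

e_neutron, e_alpha = product_energies(Q_DT_MEV, M_NEUTRON, M_ALPHA)
print(f"neutron: {e_neutron:.1f} MeV, alpha: {e_alpha:.1f} MeV")
```

The neutron comes out with roughly 14.1 MeV of the 17.6 MeV released, which is why capturing the neutrons' energy dominates fusion power-plant engineering.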
Some nuclear fusion-fission hybrids are proposed to make use of those neutrons either to maintain a subcritical reactor or to aid in nuclear transmutation of harmful long-lived nuclear waste to shorter-lived or stable nuclides. Neutron beams and modification of beams after production Free neutron beams are obtained from neutron sources by neutron transport. For access to intense neutron sources, researchers must go to a specialized neutron facility that operates a research reactor or a spallation source. The neutron's lack of total electric charge makes neutrons difficult to steer or accelerate. Charged particles can be accelerated, decelerated, or deflected by electric or magnetic fields. These methods have little effect on neutrons. But some effects may be attained by use of inhomogeneous magnetic fields because of the neutron's magnetic moment. Neutrons can be controlled by methods that include moderation, reflection, and velocity selection. Thermal neutrons can be polarized by transmission through magnetic materials in a method analogous to the Faraday effect for photons. Cold neutrons of wavelengths of 6–7 angstroms can be produced in beams of a high degree of polarization, by use of magnetic mirrors and magnetized interference filters. Applications The neutron plays an important role in many nuclear reactions. For example, neutron capture often results in neutron activation, inducing radioactivity. In particular, knowledge of neutrons and their behavior has been important in the development of nuclear reactors and nuclear weapons. The fissioning of elements like uranium-235 and plutonium-239 is caused by their absorption of neutrons. Cold, thermal, and hot neutron radiation is commonly employed in neutron scattering facilities, where the radiation is used in a way similar to how X-rays are used for the analysis of condensed matter.
Neutrons are complementary to X-rays in terms of atomic contrast, owing to different scattering cross sections; sensitivity to magnetism; the energy range for inelastic neutron spectroscopy; and deep penetration into matter. The development of "neutron lenses" based on total internal reflection within hollow glass capillary tubes or by reflection from dimpled aluminum plates has driven ongoing research into neutron microscopy and neutron/gamma ray tomography. A major use of neutrons is to excite delayed and prompt gamma rays from elements in materials. This forms the basis of neutron activation analysis (NAA) and prompt gamma neutron activation analysis (PGNAA). NAA is most often used to analyze small samples of materials in a nuclear reactor whilst PGNAA is most often used to analyze subterranean rocks around bore holes and industrial bulk materials on conveyor belts. Another use of neutron emitters is the detection of light nuclei, in particular the hydrogen found in water molecules. When a fast neutron collides with a light nucleus, it loses a large fraction of its energy. By measuring the rate at which slow neutrons return to the probe after reflecting off hydrogen nuclei, a neutron probe may determine the water content in soil. Medical therapies Because neutron radiation is both penetrating and ionizing, it can be exploited for medical treatments. However, neutron radiation can have the unfortunate side-effect of leaving the affected area radioactive. Neutron tomography is therefore not a viable medical application. Fast neutron therapy uses high-energy neutrons, typically greater than 20 MeV, to treat cancer. Radiation therapy of cancers is based upon the biological response of cells to ionizing radiation. If radiation is delivered in small sessions to damage cancerous areas, normal tissue will have time to repair itself, while tumor cells often cannot.
Neutron radiation can deliver energy to a cancerous region at a rate an order of magnitude larger than gamma radiation. Beams of low-energy neutrons are used in boron neutron capture therapy to treat cancer. In boron neutron capture therapy, the patient is given a drug that contains boron and that preferentially accumulates in the tumor to be targeted. The tumor is then bombarded with very low-energy neutrons (although often higher than thermal energy) which are captured by the boron-10 isotope in the boron, which produces an excited state of boron-11 that then decays to produce lithium-7 and an alpha particle that have sufficient energy to kill the malignant cell, but insufficient range to damage nearby cells. For such a therapy to be applied to the treatment of cancer, a neutron source having an intensity of the order of a thousand million (10⁹) neutrons per second per cm² is preferred. Such fluxes require a research nuclear reactor. Protection Exposure to free neutrons can be hazardous, since the interaction of neutrons with molecules in the body can cause disruption to molecules and atoms, and can also cause reactions that give rise to other forms of radiation (such as protons). The normal precautions of radiation protection apply: Avoid exposure, stay as far from the source as possible, and keep exposure time to a minimum. But particular thought must be given to how to protect from neutron exposure. For other types of radiation, e.g., alpha particles, beta particles, or gamma rays, material of a high atomic number and with high density makes for good shielding; frequently, lead is used. However, this approach will not work with neutrons, since the absorption of neutrons does not increase straightforwardly with atomic number, as it does with alpha, beta, and gamma radiation. Instead one needs to look at the particular interactions neutrons have with matter (see the section on detection above).
For example, hydrogen-rich materials are often used to shield against neutrons, since ordinary hydrogen both scatters and slows neutrons. This often means that simple concrete blocks or even paraffin-loaded plastic blocks afford better protection from neutrons than do far more dense materials. After slowing, neutrons may then be absorbed with an isotope that has high affinity for slow neutrons without causing secondary capture radiation, such as lithium-6. Hydrogen-rich ordinary water affects neutron absorption in nuclear fission reactors: Usually, neutrons are so strongly absorbed by normal water that fuel enrichment with fissionable isotope is required. The deuterium in heavy water has a very much lower absorption affinity for neutrons than does protium (normal light hydrogen). Deuterium is, therefore, used in CANDU-type reactors, in order to slow (moderate) neutron velocity, to increase the probability of nuclear fission compared to neutron capture. Neutron temperature Thermal neutrons Thermal neutrons are free neutrons whose energies have a Maxwell–Boltzmann distribution with kT ≈ 0.025 eV at room temperature. This gives a characteristic (not average or median) speed of 2.2 km/s. The name 'thermal' comes from their energy being that of the room temperature gas or material they are permeating (see kinetic theory for energies and speeds of molecules). After a number of collisions (often in the range of 10–20) with nuclei, neutrons arrive at this energy level, provided that they are not absorbed. In many substances, thermal neutron reactions show a much larger effective cross-section than reactions involving faster neutrons, and thermal neutrons can therefore be absorbed more readily (i.e., with higher probability) by any atomic nuclei that they collide with, creating a heavier – and often unstable – isotope of the chemical element as a result.
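The 2.2 km/s figure quoted above follows directly from the Maxwell–Boltzmann distribution: the most probable speed is v = sqrt(2kT/m). A short check (the 293 K room temperature is an illustrative assumption) also gives the corresponding de Broglie wavelength, which is what makes thermal neutrons useful for probing atomic-scale structure.

```python
import math

# Check of the quoted figures: most probable speed of a Maxwell-Boltzmann
# neutron gas, v = sqrt(2*k*T/m), at an assumed room temperature of 293 K,
# plus the matching de Broglie wavelength lambda = h / (m*v).
# Constants are CODATA values.

K_B = 1.380649e-23        # Boltzmann constant, J/K
H = 6.62607015e-34        # Planck constant, J*s
M_N = 1.67492749804e-27   # neutron mass, kg

def most_probable_speed(temperature_k: float) -> float:
    """Most probable neutron speed (m/s) at the given temperature."""
    return math.sqrt(2.0 * K_B * temperature_k / M_N)

v = most_probable_speed(293.0)
wavelength = H / (M_N * v)
print(f"speed: {v:.0f} m/s, wavelength: {wavelength * 1e10:.2f} angstrom")
```

This reproduces the 2.2 km/s characteristic speed and a wavelength of about 1.8 Å, comparable to interatomic spacings in solids.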
Most fission reactors use a neutron moderator to slow down, or thermalize the neutrons that are emitted by nuclear fission so that they are more easily captured, causing further fission. Others, called fast breeder reactors, use fission energy neutrons directly. Cold neutrons Cold neutrons are thermal neutrons that have been equilibrated in a very cold substance such as liquid deuterium. Such a cold source is placed in the moderator of a research reactor or spallation source. Cold neutrons are particularly valuable for neutron scattering experiments. Ultracold neutrons Ultracold neutrons are produced by inelastic scattering of cold neutrons in substances with a low neutron absorption cross section at a temperature of a few kelvins, such as solid deuterium or superfluid helium. An alternative production method is the mechanical deceleration of cold |
radiating advertisements and "electro-graphic architecture". Neon played a role in the basic understanding of the nature of atoms in 1913, when J. J. Thomson, as part of his exploration into the composition of canal rays, channeled streams of neon ions through a magnetic and an electric field and measured the deflection of the streams with a photographic plate. Thomson observed two separate patches of light on the photographic plate (see image), which suggested two different parabolas of deflection. Thomson eventually concluded that some of the atoms in the neon gas were of higher mass than the rest. Though not understood at the time by Thomson, this was the first discovery of isotopes of stable atoms. Thomson's device was a crude version of the instrument we now term a mass spectrometer. Isotopes Neon is the second lightest inert gas. Neon has three stable isotopes: 20Ne (90.48%), 21Ne (0.27%) and 22Ne (9.25%). 21Ne and 22Ne are partly primordial and partly nucleogenic (i.e. made by nuclear reactions of other nuclides with neutrons or other particles in the environment) and their variations in natural abundance are well understood. In contrast, 20Ne (the chief primordial isotope made in stellar nucleosynthesis) is not known to be nucleogenic or radiogenic. The causes of the variation of 20Ne in the Earth have thus been hotly debated. The principal nuclear reactions generating nucleogenic neon isotopes start from 24Mg and 25Mg, which produce 21Ne and 22Ne respectively, after neutron capture and immediate emission of an alpha particle. The neutrons that produce the reactions are mostly produced by secondary spallation reactions from alpha particles, in turn derived from uranium-series decay chains. The net result yields a trend towards lower 20Ne/22Ne and higher 21Ne/22Ne ratios observed in uranium-rich rocks such as granites. 21Ne may also be produced in a nucleogenic reaction, when 20Ne absorbs a neutron from various natural terrestrial neutron sources. 
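The isotopic abundances quoted above determine neon's standard atomic weight (~20.18 u). A quick check is shown below; the isotopic masses used are standard tabulated nuclear-data values, not taken from this text.

```python
# Recover neon's average atomic mass from the abundances quoted in the text.
# Isotopic masses (in u) are standard tabulated values (an outside input).

NEON_ISOTOPES = {
    # mass number: (isotopic mass / u, natural abundance)
    20: (19.9924402, 0.9048),
    21: (20.9938467, 0.0027),
    22: (21.9913851, 0.0925),
}

average_mass = sum(mass * abundance for mass, abundance in NEON_ISOTOPES.values())
print(f"average atomic mass of neon: {average_mass:.3f} u")
```

The abundance-weighted sum comes out at about 20.180 u, matching the accepted standard atomic weight of neon.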
In addition, isotopic analysis of exposed terrestrial rocks has demonstrated the cosmogenic (cosmic ray) production of 21Ne. This isotope is generated by spallation reactions on magnesium, sodium, silicon, and aluminium. By analyzing all three isotopes, the cosmogenic component can be resolved from magmatic neon and nucleogenic neon. This suggests that neon will be a useful tool in determining cosmic exposure ages of surface rocks and meteorites. Similar to xenon, neon content observed in samples of volcanic gases is enriched in 20Ne and nucleogenic 21Ne relative to 22Ne content. The neon isotopic content of these mantle-derived samples represents a non-atmospheric source of neon. The 20Ne-enriched components are attributed to exotic primordial rare-gas components in the Earth, possibly representing solar neon. Elevated 20Ne abundances are found in diamonds, further suggesting a solar-neon reservoir in the Earth. Characteristics Neon is the second-lightest noble gas, after helium. It glows reddish-orange in a vacuum discharge tube. Also, neon has the narrowest liquid range of any element: from 24.55 to 27.05 K (−248.45 °C to −245.95 °C, or −415.21 °F to −410.71 °F). It has over 40 times the refrigerating capacity (per unit volume) of liquid helium and three times that of liquid hydrogen. In most applications it is a less expensive refrigerant than helium. Neon plasma has the most intense light discharge at normal voltages and currents of all the noble gases. The average color of this light to the human eye is red-orange due to many lines in this range; it also contains a strong green line, which is hidden, unless the visual components are dispersed by a spectroscope. Two quite different kinds of neon lighting are in common use. Neon glow lamps are generally tiny, with most operating between 100 and 250 volts. They have been widely used as power-on indicators and in circuit-testing equipment, but light-emitting diodes (LEDs) now dominate in those applications. 
These simple neon devices were the forerunners of plasma displays and plasma television screens. Neon signs typically operate at much higher voltages (2–15 kilovolts), and the luminous tubes are commonly meters long. The glass tubing is often formed into shapes and letters for signage, as well as architectural and artistic applications. Occurrence Stable isotopes of neon are produced in stars. Neon's most abundant isotope 20Ne (90.48%) is created by the nuclear fusion of carbon and carbon in the carbon-burning process of stellar nucleosynthesis. This requires temperatures above 500 megakelvins, which occur in the cores of stars of more than 8 solar masses. Neon is abundant on a universal scale; it is the fifth most abundant chemical element in the universe by mass, after hydrogen, helium, oxygen, and carbon (see chemical element). Its relative rarity on Earth, like that of helium, is due to its relative lightness, high vapor pressure at very low temperatures, and chemical inertness, all properties which tend to keep it from being trapped in the condensing gas and dust clouds that formed the smaller and warmer solid planets like Earth. Neon is monatomic, making it lighter than the molecules of diatomic nitrogen and oxygen which form the bulk of Earth's atmosphere; a balloon filled with neon will rise in air, albeit more slowly than a helium balloon. Neon's abundance in the universe is about 1 part in 750; in the Sun and presumably in the proto-solar system nebula, about 1 part in 600. The Galileo spacecraft atmospheric entry probe found that even in the upper atmosphere of Jupiter, the abundance of neon is reduced (depleted) by about a factor of 10, to a level of 1 part in 6,000 by mass. This may indicate that even the ice-planetesimals, which brought neon into Jupiter from the outer solar system, formed in a region which was too warm to retain the neon atmospheric component (abundances of heavier inert gases on Jupiter are several times that found in the Sun). 
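The balloon comparison above can be checked with ideal-gas densities, rho = P·M/(R·T). The conditions (1 atm, 293 K) and the molar mass of air are illustrative assumptions, not values from the text.

```python
# Ideal-gas check of the balloon claim: rho = P*M/(R*T). Conditions (1 atm,
# 293 K) and the air molar mass are assumed illustrative values.

R = 8.314462618   # gas constant, J/(mol*K)
P = 101325.0      # pressure, Pa
T = 293.0         # temperature, K

MOLAR_MASS_KG = {"air": 0.028964, "neon": 0.020180, "helium": 0.0040026}

def density(gas: str) -> float:
    """Ideal-gas density in kg/m^3 under the conditions above."""
    return P * MOLAR_MASS_KG[gas] / (R * T)

for gas in ("neon", "helium"):
    lift = density("air") - density(gas)
    print(f"{gas}: net buoyant lift about {lift:.2f} kg per cubic metre")
```

Neon's density comes out below that of air but well above helium's, so a neon balloon rises, with roughly a third of a helium balloon's lift per unit volume.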
Neon comprises 1 part in 55,000 in the Earth's atmosphere, or 18.2 ppm by volume (this is about the same as the molecule or mole fraction).
as is with 28 protons and 50 neutrons. Both are therefore unusually stable for nuclides with so large a proton–neutron imbalance. Nickel-63 is a contaminant found in the support structure of nuclear reactors. It is produced through neutron capture by nickel-62. Small amounts have also been found near nuclear weapon test sites in the South Pacific. Occurrence On Earth, nickel occurs most often in combination with sulfur and iron in pentlandite, with sulfur in millerite, with arsenic in the mineral nickeline, and with arsenic and sulfur in nickel galena. Nickel is commonly found in iron meteorites as the alloys kamacite and taenite. The presence of nickel in meteorites was first detected in 1799 by Joseph-Louis Proust, a French chemist who then worked in Spain. Proust analyzed samples of the meteorite from Campo del Cielo (Argentina), which had been obtained in 1783 by Miguel Rubín de Celis, discovering the presence in them of nickel (about 10%) along with iron. The bulk of the nickel is mined from two types of ore deposits. The first is laterite, where the principal ore mineral mixtures are nickeliferous limonite, (Fe,Ni)O(OH), and garnierite (a mixture of various hydrous nickel and nickel-rich silicates). The second is magmatic sulfide deposits, where the principal ore mineral is pentlandite: . Indonesia and Australia have the biggest estimated reserves, at 43.6% of the world's total. Identified land-based resources throughout the world averaging 1% nickel or greater comprise at least 130 million tons of nickel (about double the known reserves). About 60% is in laterites and 40% in sulfide deposits. On geophysical evidence, most of the nickel on Earth is believed to be in the Earth's outer and inner cores. Kamacite and taenite are naturally occurring alloys of iron and nickel.
For kamacite, the alloy is usually in the proportion of 90:10 to 95:5, although impurities (such as cobalt or carbon) may be present, while for taenite the nickel content is between 20% and 65%. Kamacite and taenite are also found in nickel iron meteorites. Compounds The most common oxidation state of nickel is +2, but compounds of Ni0, Ni+, and Ni3+ are well known, and the exotic oxidation states Ni2−, Ni1−, and Ni4+ have been produced and studied. Nickel(0) Nickel tetracarbonyl (Ni(CO)4), discovered by Ludwig Mond, is a volatile, highly toxic liquid at room temperature. On heating, the complex decomposes back to nickel and carbon monoxide: Ni(CO)4 ⇌ Ni + 4 CO. This behavior is exploited in the Mond process for purifying nickel, as described above. The related nickel(0) complex bis(cyclooctadiene)nickel(0) is a useful catalyst in organonickel chemistry because the cyclooctadiene (or cod) ligands are easily displaced. Nickel(I) Nickel(I) complexes are uncommon, but one example is the tetrahedral complex NiBr(PPh3)3. Many nickel(I) complexes feature Ni-Ni bonding, such as the dark red diamagnetic prepared by reduction of with sodium amalgam. This compound is oxidised in water, liberating . It is thought that the nickel(I) oxidation state is important to nickel-containing enzymes, such as [NiFe]-hydrogenase, which catalyzes the reversible reduction of protons to . Nickel(II) Nickel(II) forms compounds with all common anions, including sulfide, sulfate, carbonate, hydroxide, carboxylates, and halides. Nickel(II) sulfate is produced in large quantities by dissolving nickel metal or oxides in sulfuric acid, forming both hexa- and heptahydrates useful for electroplating nickel. Common salts of nickel, such as chloride, nitrate, and sulfate, dissolve in water to give green solutions of the metal aquo complex . The four halides form nickel compounds, which are solids with molecules that feature octahedral Ni centres.
Nickel(II) chloride is most common, and its behavior is illustrative of the other halides. Nickel(II) chloride is produced by dissolving nickel or its oxide in hydrochloric acid. It is usually encountered as the green hexahydrate, the formula of which is usually written NiCl2•6H2O. When dissolved in water, this salt forms the metal aquo complex . Dehydration of NiCl2•6H2O gives the yellow anhydrous NiCl2. Some tetracoordinate nickel(II) complexes, e.g. bis(triphenylphosphine)nickel chloride, exist in both tetrahedral and square planar geometries. The tetrahedral complexes are paramagnetic, whereas the square planar complexes are diamagnetic. In exhibiting this magnetic equilibrium and in forming octahedral complexes, they contrast with the divalent complexes of the heavier group 10 metals, palladium(II) and platinum(II), which form only square-planar geometry. Nickelocene is known; it has an electron count of 20, making it relatively unstable. Nickel(III) and (IV) Numerous Ni(III) compounds are known, with the first such examples being nickel(III) trihalophosphines (NiIII(PPh3)X3). Further, Ni(III) forms simple salts with fluoride or oxide ions. Ni(III) can be stabilized by σ-donor ligands such as thiols and organophosphines. Ni(IV) is present in the mixed oxide , while Ni(III) is present in nickel oxide hydroxide, which is used as the cathode in many rechargeable batteries, including nickel-cadmium, nickel-iron, nickel hydrogen, and nickel-metal hydride, and used by certain manufacturers in Li-ion batteries. Ni(IV) remains a rare oxidation state of nickel and very few compounds are known to date. History Because the ores of nickel are easily mistaken for ores of silver and copper, understanding of this metal and its use dates to relatively recent times. However, the unintentional use of nickel is ancient, and can be traced back as far as 3500 BCE. Bronzes from what is now Syria have been found to contain as much as 2% nickel.
Some ancient Chinese manuscripts suggest that "white copper" (cupronickel, known as baitong) was used there between 1700 and 1400 BCE. This Paktong white copper was exported to Britain as early as the 17th century, but the nickel content of this alloy was not discovered until 1822. Coins of nickel-copper alloy were minted by the Bactrian kings Agathocles, Euthydemus II, and Pantaleon in the 2nd century BCE, possibly out of the Chinese cupronickel. In medieval Germany, a metallic yellow mineral was found in the Erzgebirge (Ore Mountains) that resembled copper ore. However, when miners were unable to extract any copper from it, they blamed a mischievous sprite of German mythology, Nickel (similar to Old Nick), for besetting the copper. They called this ore Kupfernickel from the German Kupfer for copper. This ore is now known as the mineral nickeline (formerly niccolite), a nickel arsenide. In 1751, Baron Axel Fredrik Cronstedt tried to extract copper from kupfernickel at a cobalt mine in the Swedish village of Los, and instead produced a white metal that he named nickel after the spirit that had given its name to the mineral. In modern German, Kupfernickel or Kupfer-Nickel designates the alloy cupronickel. Originally, the only source for nickel was the rare Kupfernickel. Beginning in 1824, nickel was obtained as a byproduct of cobalt blue production. The first large-scale smelting of nickel began in Norway in 1848 from nickel-rich pyrrhotite. The introduction of nickel in steel production in 1889 increased the demand for nickel, and the nickel deposits of New Caledonia, discovered in 1865, provided most of the world's supply between 1875 and 1915. The discovery of the large deposits in the Sudbury Basin, Canada in 1883, in Norilsk-Talnakh, Russia in 1920, and in the Merensky Reef, South Africa in 1924, made large-scale production of nickel possible. Coinage Aside from the aforementioned Bactrian coins, nickel was not a component of coins until the mid-19th century. 
Canada 99.9% nickel five-cent coins were struck in Canada (the world's largest nickel producer at the time) during non-war years from 1922 to 1981; the metal content made these coins magnetic. During the wartime period 1942–1945, most or all nickel was removed from Canadian and US coins to save it for manufacturing armor. Canada used 99.9% nickel from 1968 in its higher-value coins until 2000. Switzerland Coins of nearly pure nickel were first used in 1881 in Switzerland. United Kingdom Birmingham forged nickel coins for trading in Malaysia. United States In the United States, the term "nickel" or "nick" originally applied to the copper-nickel Flying Eagle cent, which replaced copper with an alloy of 12% nickel in 1857–58, and then to the Indian Head cent of the same alloy from 1859 to 1864. Still later, in 1865, the term designated the three-cent nickel, with nickel increased to 25%. In 1866, the five-cent shield nickel (25% nickel, 75% copper) appropriated the designation. Along with the alloy proportion, this term has been used to the present in the United States. Current use In the 21st century, the high price of nickel has led to some replacement of the metal in coins around the world. Coins still made with nickel alloys include one- and two-euro coins; 5¢, 10¢, 25¢, 50¢, and $1 U.S. coins; and 20p, 50p, £1, and £2 UK coins. From 2012 on, the nickel alloy used for 5p and 10p UK coins was replaced with nickel-plated steel. This ignited a public controversy regarding the problems of people with nickel allergy. World production More than 2.5 million tonnes (t) of nickel per year are estimated to be mined worldwide, with Indonesia (760,000 t), the Philippines (320,000 t), Russia (280,000 t), New Caledonia (200,000 t), Australia (170,000 t) and Canada (150,000 t) being the largest producers as of 2020. The largest deposits of nickel in non-Russian Europe are located in Finland and Greece.
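As a rough check on the figures above, the national outputs can be expressed as shares of the estimated 2.5-million-tonne world total; a minimal sketch in Python:

```python
# Mine production (tonnes, 2020 figures from the text) compared with
# the estimated 2.5 Mt world total.
production_t = {
    "Indonesia": 760_000,
    "Philippines": 320_000,
    "Russia": 280_000,
    "New Caledonia": 200_000,
    "Australia": 170_000,
    "Canada": 150_000,
}
world_total_t = 2_500_000

for country, tonnes in production_t.items():
    print(f"{country}: {tonnes / world_total_t:.1%}")
# Indonesia alone is about 30% of the total; the six countries listed
# account for roughly three quarters of world mine production.
print(f"top six combined: {sum(production_t.values()) / world_total_t:.1%}")
```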
Identified land-based resources averaging 1% nickel or greater contain at least 130 million tonnes of nickel. Approximately 60% is in laterites and 40% is in sulfide deposits. In addition, extensive nickel sources are found in the depths of the Pacific Ocean, particularly within an area called the Clarion–Clipperton Zone, in the form of polymetallic nodules peppering the seafloor at a depth of 3.5–6 km below sea level. These nodules are composed of numerous rare-earth metals, and the nickel content of these nodules is estimated to be 1.7%. With advances in modern science and engineering, regulation is currently being set in place by the International Seabed Authority to ensure that these nodules are collected in an environmentally conscientious manner while adhering to the United Nations Sustainable Development Goals. The one locality in the United States where nickel has been profitably mined is Riddle, Oregon, where several square miles of nickel-bearing garnierite surface deposits are located. The mine closed in 1987. The Eagle mine project is a new nickel mine in Michigan's Upper Peninsula. Construction was completed in 2013, and operations began in the third quarter of 2014. In the first full year of operation, the Eagle Mine produced 18,000 t. Production Nickel is obtained through extractive metallurgy: it is extracted from the ore by conventional roasting and reduction processes that yield a metal of greater than 75% purity. In many stainless steel applications, 75% pure nickel can be used without further purification, depending on the impurities. Traditionally, most sulfide ores have been processed using pyrometallurgical techniques to produce a matte for further refining. Recent advances in hydrometallurgical techniques have resulted in significantly purer metallic nickel products. Most sulfide deposits have traditionally been processed by concentration through a froth flotation process followed by pyrometallurgical extraction.
In hydrometallurgical processes, nickel sulfide ores are concentrated with flotation (differential flotation if Ni/Fe ratio is too low) and then smelted. The nickel matte is further processed with the Sherritt-Gordon process. First, copper is removed by adding hydrogen sulfide, leaving a concentrate of cobalt and nickel. Then, solvent extraction is used to separate the cobalt and nickel, with the final nickel content greater than 99%. Electrorefining A second common refining process is leaching the metal matte into a nickel salt solution, followed by the electrowinning of the nickel from solution by plating it onto a cathode as electrolytic nickel. Mond process The purest metal is obtained from nickel oxide by the Mond process, which achieves a purity of greater than 99.99%. The process was patented by Ludwig Mond and has been in industrial use since before the beginning of the 20th century. In this process, nickel is reacted with carbon monoxide in the presence of a sulfur catalyst at around 40–80 °C to form nickel carbonyl. Iron gives iron pentacarbonyl, too, but this reaction is slow. If necessary, the nickel may be separated by distillation. Dicobalt octacarbonyl is also formed in nickel distillation as a by-product, but it decomposes to tetracobalt dodecacarbonyl at the reaction temperature to give a non-volatile solid. Nickel is obtained from nickel carbonyl by one of two processes. It may be passed through a large chamber at high temperatures in which tens of thousands of nickel spheres, called pellets, are constantly stirred. The carbonyl decomposes and deposits pure nickel onto the nickel spheres. In the alternate process, nickel carbonyl is decomposed in a smaller chamber at 230 °C to create a fine nickel powder. The byproduct carbon monoxide is recirculated and reused. The highly pure nickel product is known as "carbonyl nickel". 
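The overall chemistry of the Mond process described above can be summarized in the style of the equations used elsewhere in this article (temperatures as given in the text):
Ni + 4 CO → Ni(CO)4 (40–80 °C, sulfur catalyst)
Ni(CO)4 → Ni + 4 CO (about 230 °C, or on heated nickel pellets)
The purification works because nickel carbonyl is volatile while most impurities are not, and the decomposition step regenerates the carbon monoxide for reuse.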
Metal value The market price of nickel surged throughout 2006 and the early months of 2007; as of April 5, 2007, the metal was trading at US$52,300/tonne or $1.47/oz. The price subsequently fell dramatically, and as of September 2017, the metal was trading at $11,000/tonne, or $0.31/oz. The US nickel coin contains 1.25 grams of nickel, which at the April 2007 price was worth 6.5 cents, along with 3.75 grams of copper worth about 3 cents, for a total metal value of more than 9 cents. Since the face value of a nickel is 5 cents, this made it an attractive target for melting by people wanting to sell the metals at a profit. However, the United States Mint, in anticipation of this practice, implemented new interim rules on December 14, 2006, subject to public comment for 30 days, which criminalized the melting and export of cents and nickels. Violators can be punished with a fine of up to $10,000 and/or imprisoned for a maximum of five years. As of September 19, 2013, the melt value of a US nickel (copper and nickel included) was $0.045, which is 90% of the face value. Applications The global production of nickel is presently used as follows: 68% in stainless steel; 10% in nonferrous alloys; 9% in electroplating; 7% in alloy steel; 3% in foundries; and 4% in other uses (including batteries). Nickel is used in many specific and recognizable industrial and consumer products, including stainless steel, alnico magnets, coinage, rechargeable batteries, electric guitar strings, microphone capsules, plating on plumbing fixtures, and special alloys such as permalloy, elinvar, and invar. It is used for plating and as a green tint in glass. Nickel is preeminently an alloy metal, and its chief use is in nickel steels and nickel cast irons, in which it typically increases the tensile strength, toughness, and elastic limit.
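The melt-value arithmetic above can be reproduced directly. The nickel price is the April 2007 figure from the text; the copper price used here (about $8,000 per tonne) is an assumption chosen to be consistent with the stated ~3 cent copper value, not a figure from the article:

```python
def melt_value_usd(mass_g: float, price_usd_per_tonne: float) -> float:
    """Value of a given mass of metal at a quoted per-tonne price."""
    return mass_g * price_usd_per_tonne / 1_000_000  # 1 tonne = 1e6 g

# US five-cent coin: 1.25 g nickel and 3.75 g copper (75% Cu / 25% Ni, 5 g total).
ni = melt_value_usd(1.25, 52_300)  # April 2007 nickel price -> ~6.5 cents
cu = melt_value_usd(3.75, 8_000)   # assumed copper price -> ~3 cents
print(round(ni + cu, 3))           # prints 0.095: more than 9 cents vs. 5-cent face value
```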
It is widely used in many other alloys, including nickel brasses and bronzes and alloys with copper, chromium, aluminium, lead, cobalt, silver, and gold (Inconel, Incoloy, Monel, Nimonic). Because it is resistant to corrosion, nickel was occasionally used as a substitute for decorative silver. Nickel was also occasionally used in some countries after 1859 as a cheap coinage metal (see above), but in the later years of the 20th century, it was replaced by cheaper stainless steel (i.e. iron) alloys, except in the United States and Canada. Nickel is an excellent alloying agent for certain precious metals and is used in the fire assay as a collector of platinum-group elements (PGE); nickel mines that also recover PGE include Norilsk in Russia and the Sudbury Basin in Canada. Nickel foam or nickel mesh is used in gas diffusion electrodes for alkaline fuel cells. Nickel and its alloys are frequently used as catalysts for hydrogenation reactions. Raney nickel, a finely divided nickel-aluminium alloy, is one common form, though related catalysts are also used, including Raney-type catalysts. Nickel is a naturally magnetostrictive material, meaning that, in the presence of a magnetic field, the material undergoes a small change in length. The magnetostriction of nickel is on the order of 50 ppm and is negative, indicating that it contracts. Nickel is used as a binder in the cemented tungsten carbide or hardmetal industry and is used in proportions of 6% to 12% by weight. Nickel makes the tungsten carbide magnetic and adds corrosion resistance to the cemented parts, although the hardness is less than that of parts with a cobalt binder. Nickel-63, with its half-life of 100.1 years, is useful in krytron devices as a beta particle (high-speed electron) emitter to make ionization by the keep-alive electrode more reliable. It is being investigated as a power source for betavoltaic batteries.
Around 27% of all nickel production is destined for engineering, 10% for building and construction, 14% for tubular products, 20% for metal goods, 14% for transport, 11% for electronic goods, and 5% for other uses. Raney nickel is widely used for hydrogenation of unsaturated oils to make margarine, and substandard margarine and leftover oil may contain nickel as contaminant. Forte et al. found that type 2 diabetic patients have 0.89 ng/ml of Ni in the blood relative to 0.77 ng/ml in the control subjects. Biological role Although it was not recognized until the 1970s, nickel is known to play an important role in the biology of some plants, eubacteria, archaebacteria, and fungi. Nickel enzymes such as urease are considered virulence factors in some organisms. Urease catalyzes the hydrolysis of urea to form ammonia and carbamate. The NiFe hydrogenases can catalyze the oxidation of to form protons and electrons, and can also catalyze the reverse reaction, the reduction of protons to form hydrogen gas. A nickel-tetrapyrrole coenzyme, cofactor F430, is present in methyl coenzyme M reductase, which can catalyze the formation of methane, or the reverse reaction, in methanogenic archaea (in +1 oxidation state). One of the carbon monoxide dehydrogenase enzymes consists of an Fe-Ni-S cluster. Other nickel-bearing enzymes include a rare bacterial class of superoxide dismutase and glyoxalase I enzymes in bacteria and several parasitic eukaryotic trypanosomal parasites (in higher organisms, including yeast and mammals, this enzyme contains divalent Zn2+). Dietary nickel may affect human health through infections by nickel-dependent bacteria, but it is also possible that nickel is an essential nutrient for bacteria residing in the large intestine, in effect functioning as a prebiotic. The US Institute of Medicine has not confirmed that nickel is an essential nutrient for humans, so neither a Recommended Dietary Allowance (RDA) nor an Adequate Intake have been established. 
The Tolerable Upper Intake Level of dietary nickel is 1000 µg/day as soluble nickel salts. Dietary intake is estimated at 70 to 100 µg/day, with less than 10% absorbed. What is absorbed is excreted in urine. Relatively large amounts of nickel – comparable to the estimated average ingestion above – leach into food cooked in stainless steel. For example, the amount of nickel leached after 10 cooking cycles into one serving of tomato sauce averages 88 µg. Nickel released from Siberian Traps volcanic eruptions is suspected of assisting the growth of Methanosarcina, a genus of euryarchaeote archaea that produced methane during the Permian–Triassic extinction event, the biggest extinction event on record. Toxicity The major source of nickel exposure is oral consumption, as nickel is essential to plants. Nickel is found naturally in the environment: Typical background concentrations do not exceed 20 ng/m3 in the atmosphere; 100 mg/kg in soil; 10 mg/kg in vegetation; 10 μg/L in freshwater and 1 μg/L in seawater. Environmental concentrations of nickel may be increased by human pollution. For example, nickel-plated faucets may contaminate water and soil; mining and smelting may dump nickel into waste-water; nickel–steel alloy cookware and nickel-pigmented dishes may release nickel into food. The atmosphere may be polluted by nickel ore refining and fossil fuel combustion. Humans may absorb nickel directly from tobacco smoke and skin contact with jewelry, shampoos, detergents, and coins. A less-common form of chronic exposure is through hemodialysis as traces of nickel ions may be absorbed into the plasma from the chelating action of albumin. The average daily exposure does not pose a threat to human health. Most of the nickel absorbed every day by humans is removed by the kidneys and passed out of the body through urine or is eliminated through the gastrointestinal tract without being absorbed. 
Nickel is not a cumulative poison, but larger doses or chronic inhalation exposure may be toxic, even carcinogenic, and constitute an occupational hazard. Nickel compounds are classified as human carcinogens based on increased respiratory cancer risks observed in epidemiological studies of sulfidic ore refinery workers. This is supported by the positive results of the NTP bioassays with Ni sub-sulfide and Ni oxide in rats and mice. The human and animal data consistently indicate a lack of carcinogenicity via the oral route of exposure and limit the carcinogenicity of nickel compounds to respiratory tumours after inhalation. Nickel metal is classified as a suspect carcinogen; there is consistency between the absence of increased respiratory cancer risks in workers predominantly exposed to metallic nickel and the lack of respiratory tumours in a rat lifetime inhalation carcinogenicity study with nickel metal powder. In the rodent inhalation studies with various nickel compounds and nickel metal, increased lung inflammations with and without bronchial lymph node hyperplasia or fibrosis were observed. In rat studies, oral ingestion of water-soluble nickel salts can trigger perinatal mortality effects in pregnant animals. Whether these effects are relevant to humans is unclear as epidemiological studies of highly exposed female workers have not shown adverse developmental toxicity effects. People can be exposed to nickel in the workplace by inhalation, ingestion, and contact with skin or eye. The Occupational Safety and Health Administration (OSHA) has set the legal limit (permissible exposure limit) for the workplace at 1 mg/m3 per 8-hour workday, excluding nickel carbonyl. The National Institute for Occupational Safety and Health (NIOSH) specifies the recommended exposure limit (REL) of 0.015 mg/m3 per 8-hour workday. At 10 mg/m3, nickel is immediately dangerous to life and health. Nickel carbonyl [] is an extremely toxic gas. 
The toxicity of metal carbonyls is a function of both the toxicity of the metal and the off-gassing of carbon monoxide from the carbonyl functional groups; nickel carbonyl is also explosive in air. Sensitized individuals may show a skin contact allergy to nickel known as contact dermatitis. Highly sensitized individuals may also react to foods with high nickel content. Sensitivity to nickel may also be present in patients with pompholyx. Nickel is the top confirmed contact allergen worldwide, partly due to its use in jewelry for pierced ears. Nickel allergies affecting pierced ears are often marked by itchy, red skin. Many earrings are now made without nickel or with low-release nickel to address this problem. The amount allowed in products that contact human skin is now regulated by the European Union. In 2002, researchers found that the nickel released by 1 and 2 Euro coins was far in excess of those standards. This is believed to be the result of a galvanic reaction. Nickel was voted Allergen of the Year in 2008 by the American Contact Dermatitis Society. In August 2015, the American Academy of Dermatology adopted a position statement on the safety of nickel: "Estimates suggest that contact dermatitis, which includes nickel sensitization, accounts for approximately $1.918 billion and affects nearly 72.29 million people."
Tantalus: niobium (from Niobe) and pelopium (from Pelops). This confusion arose from the minimal observed differences between tantalum and niobium. The claimed new elements pelopium, ilmenium, and dianium were in fact identical to niobium or mixtures of niobium and tantalum. The differences between tantalum and niobium were unequivocally demonstrated in 1864 by Christian Wilhelm Blomstrand and Henri Étienne Sainte-Claire Deville, as well as Louis J. Troost, who determined the formulas of some of the compounds in 1865 and finally by Swiss chemist Jean Charles Galissard de Marignac in 1866, who all proved that there were only two elements. Articles on ilmenium continued to appear until 1871. De Marignac was the first to prepare the metal in 1864, when he reduced niobium chloride by heating it in an atmosphere of hydrogen. Although de Marignac was able to produce tantalum-free niobium on a larger scale by 1866, it was not until the early 20th century that niobium was used in incandescent lamp filaments, the first commercial application. This use quickly became obsolete through the replacement of niobium with tungsten, which has a higher melting point. That niobium improves the strength of steel was first discovered in the 1920s, and this application remains its predominant use. In 1961, the American physicist Eugene Kunzler and coworkers at Bell Labs discovered that niobium–tin continues to exhibit superconductivity in the presence of strong electric currents and magnetic fields, making it the first material to support the high currents and fields necessary for useful high-power magnets and electrical power machinery. This discovery enabled—two decades later—the production of long multi-strand cables wound into coils to create large, powerful electromagnets for rotating machinery, particle accelerators, and particle detectors. Naming the element Columbium (symbol "Cb") was the name originally bestowed by Hatchett upon his discovery of the metal in 1801. 
The name reflected that the type specimen of the ore came from America (Columbia). This name remained in use in American journals—the last paper published by the American Chemical Society with columbium in its title dates from 1953—while niobium was used in Europe. To end this confusion, the name niobium was chosen for element 41 at the 15th Conference of the Union of Chemistry in Amsterdam in 1949. A year later this name was officially adopted by the International Union of Pure and Applied Chemistry (IUPAC) after 100 years of controversy, despite the chronological precedence of the name columbium. This was a compromise of sorts; the IUPAC accepted tungsten instead of wolfram in deference to North American usage, and niobium instead of columbium in deference to European usage. While many US chemical societies and government organizations typically use the official IUPAC name, some metallurgists and metal societies still use the original American name, "columbium". Characteristics Physical Niobium is a lustrous, grey, ductile, paramagnetic metal in group 5 of the periodic table (see table), with an electron configuration in the outermost shells atypical for group 5. (Similar anomalies can be observed in the neighborhood of ruthenium (44), rhodium (45), and palladium (46).) Although it is thought to have a body-centered cubic crystal structure from absolute zero to its melting point, high-resolution measurements of the thermal expansion along the three crystallographic axes reveal anisotropies which are inconsistent with a cubic structure. Therefore, further research and discovery in this area is expected. Niobium becomes a superconductor at cryogenic temperatures. At atmospheric pressure, it has the highest critical temperature of the elemental superconductors at 9.2 K. Niobium has the greatest magnetic penetration depth of any element. In addition, it is one of the three elemental Type II superconductors, along with vanadium and technetium.
The superconductive properties are strongly dependent on the purity of the niobium metal. When very pure, it is comparatively soft and ductile, but impurities make it harder. The metal has a low capture cross-section for thermal neutrons; thus it is used in the nuclear industries where neutron transparent structures are desired. Chemical The metal takes on a bluish tinge when exposed to air at room temperature for extended periods. Despite a high melting point in elemental form (2,468 °C), it has a lower density than other refractory metals. Furthermore, it is corrosion-resistant, exhibits superconductivity properties, and forms dielectric oxide layers. Niobium is slightly less electropositive and more compact than its predecessor in the periodic table, zirconium, whereas it is virtually identical in size to the heavier tantalum atoms, as a result of the lanthanide contraction. As a result, niobium's chemical properties are very similar to those for tantalum, which appears directly below niobium in the periodic table. Although its corrosion resistance is not as outstanding as that of tantalum, the lower price and greater availability make niobium attractive for less demanding applications, such as vat linings in chemical plants. Isotopes Niobium in the Earth's crust comprises one stable isotope, 93Nb. By 2003, at least 32 radioisotopes had been synthesized, ranging in atomic mass from 81 to 113. The most stable of these is 92Nb with a half-life of 34.7 million years. One of the least stable is 113Nb, with an estimated half-life of 30 milliseconds. Isotopes that are lighter than the stable 93Nb tend to decay by β+ decay, and those that are heavier tend to decay by β− decay, with some exceptions. 81Nb, 82Nb, and 84Nb have minor β+-delayed proton emission decay paths, 91Nb decays by electron capture and positron emission, and 92Nb decays by both β+ and β− decay. At least 25 nuclear isomers have been described, ranging in atomic mass from 84 to 104. 
Within this range, only 96Nb, 101Nb, and 103Nb do not have isomers. The most stable of niobium's isomers is 93mNb with a half-life of 16.13 years. The least stable isomer is 84mNb with a half-life of 103 ns. All of niobium's isomers decay by isomeric transition or beta decay except 92m1Nb, which has a minor electron capture branch. Occurrence Niobium is estimated to be the 34th-most common element in the Earth's crust, with 20 ppm. Some think that the abundance on Earth is much greater, and that the element's high density has concentrated it in the Earth's core. The free element is not found in nature, but niobium occurs in combination with other elements in minerals. Minerals that contain niobium often also contain tantalum. Examples include columbite () and columbite–tantalite (or coltan, ). Columbite–tantalite minerals (the most common species being columbite-(Fe) and tantalite-(Fe), where "-(Fe)" is the Levinson suffix indicating the prevalence of iron over other elements such as manganese) are most usually found as accessory minerals in pegmatite intrusions and in alkaline intrusive rocks. Less common are the niobates of calcium, uranium, thorium and the rare earth elements. Examples of such niobates are pyrochlore () (now a group name, with a relatively common example being fluorcalciopyrochlore) and euxenite (correctly named euxenite-(Y)) (). Large deposits of niobium have been found associated with carbonatites (carbonate-silicate igneous rocks) and as a constituent of pyrochlore. The three largest currently mined deposits of pyrochlore, two in Brazil and one in Canada, were found in the 1950s, and are still the major producers of niobium mineral concentrates.
The largest deposit is hosted within a carbonatite intrusion in Araxá, state of Minas Gerais, Brazil, owned by CBMM (Companhia Brasileira de Metalurgia e Mineração); the other active Brazilian deposit is located near Catalão, state of Goiás, and owned by China Molybdenum, also hosted within a carbonatite intrusion. Together, those two mines produce about 88% of the world's supply. Brazil also has a large but still unexploited deposit near São Gabriel da Cachoeira, state of Amazonas, as well as a few smaller deposits, notably in the state of Roraima. The third largest producer of niobium is the carbonatite-hosted Niobec mine, in Saint-Honoré, near Chicoutimi, Quebec, Canada, owned by Magris Resources. It produces between 7% and 10% of the world's supply. Production After the separation from the other minerals, the mixed oxides of tantalum Ta2O5 and niobium Nb2O5 are obtained. The first step in the processing is the reaction of the oxides with hydrofluoric acid: Ta2O5 + 14 HF → 2 H2[TaF7] + 5 H2O Nb2O5 + 10 HF → 2 H2[NbOF5] + 3 H2O The first industrial scale separation, developed by de Marignac, exploits the differing solubilities of the complex niobium and tantalum fluorides, dipotassium oxypentafluoroniobate monohydrate (K2[NbOF5]·H2O) and dipotassium heptafluorotantalate (K2[TaF7]), in water. Newer processes use the liquid extraction of the fluorides from aqueous solution by organic solvents like cyclohexanone. The complex niobium and tantalum fluorides are extracted separately from the organic solvent with water and either precipitated by the addition of potassium fluoride to produce a potassium fluoride complex, or precipitated with ammonia as the pentoxide: H2[NbOF5] + 2 KF →
compounds being the most strongly colored for the trivalent lanthanides, it can occasionally dominate the coloration of rare-earth minerals when competing chromophores are absent. It usually gives a pink coloration. Outstanding examples of this include monazite crystals from the tin deposits in Llallagua, Bolivia; ancylite from Mont Saint-Hilaire, Quebec, Canada; or lanthanite from the Saucon Valley, Pennsylvania, United States. As with neodymium glasses, such minerals change their colors under the differing lighting conditions. The absorption bands of neodymium interact with the visible emission spectrum of mercury vapor, with the unfiltered shortwave UV light causing neodymium-containing minerals to reflect a distinctive green color. This can be observed with monazite-containing sands or bastnäsite-containing ore. The demand for mineral resources, such as rare-earth elements (including neodymium) and other critical materials, has been rapidly increasing owing to the growing population and industrial development. Recently, the requirement for a low-carbon society has led to a significant demand for energy-saving technologies such as batteries, high-efficiency motors, renewable energy sources, and fuel cells. Among these technologies, permanent magnets are often used to fabricate high-efficiency motors, with neodymium-iron-boron magnets (Nd2Fe14B sintered and bonded magnets; hereinafter referred to as NdFeB magnets) being the main type of permanent magnet in the market since their invention. NdFeB magnets are used in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), electric vehicles (EVs), and fuel cell vehicles (FCVs) (hereinafter referred to as xEVs), wind turbines, home appliances, computers, and many small consumer electronic devices. Furthermore, they are indispensable for energy savings. Toward achieving the objectives of the Paris Agreement, the demand for NdFeB magnets is expected to increase significantly in the future. 
Applications Neodymium has an unusually large specific heat capacity at liquid-helium temperatures, so it is useful in cryocoolers. Neodymium acetate can be a substitute for the radioactive and toxic uranyl acetate (used as a standard contrasting agent in electron microscopy). Probably because of similarities to Ca2+, Nd3+ has been reported to promote plant growth. Rare-earth element compounds are frequently used in China as fertilizer. Samarium–neodymium dating is useful for determining the age relationships of rocks and meteorites. Neodymium isotopes recorded in marine sediments are used to reconstruct changes in past ocean circulation. Magnets Neodymium magnets (actually an alloy, Nd2Fe14B) are the strongest permanent magnets known. A neodymium magnet of a few tens of grams can lift a thousand times its own weight and can snap together with enough force to break bones. These magnets are cheaper, lighter, and stronger than samarium–cobalt magnets. However, they are not superior in every aspect: neodymium-based magnets lose their magnetism at lower temperatures and tend to corrode, while samarium–cobalt magnets do not. Neodymium magnets appear in products such as microphones, professional loudspeakers, in-ear headphones, guitar and bass guitar pick-ups, and computer hard disks, where low mass, small volume, or strong magnetic fields are required. Neodymium is used in the electric motors of hybrid and electric automobiles and in the electricity generators of some designs of commercial wind turbines (only wind turbines with "permanent magnet" generators use neodymium). For example, the drive electric motors of each Toyota Prius require one kilogram (2.2 pounds) of neodymium per vehicle. In 2020, physics researchers at Radboud University and Uppsala University announced they had observed a behavior known as "self-induced spin glass" in the atomic structure of neodymium. One of the researchers explained, "…we are specialists in scanning tunneling microscopy. 
It allows us to see the structure of individual atoms, and we can resolve the north and south poles of the atoms. With this advancement in high-precision imaging, we were able to discover the behavior in neodymium, because we could resolve the incredibly small changes in the magnetic structure." Neodymium behaves in a complex magnetic way that had not been seen before in a periodic table element. Glass Neodymium glass (Nd:glass) is produced by the inclusion of neodymium oxide (Nd2O3) in the glass melt. Usually in daylight or incandescent light neodymium glass appears lavender, but it appears pale blue under fluorescent lighting. Neodymium may be used to color glass in delicate shades ranging from pure violet through wine-red and warm gray. The first commercial use of purified neodymium was in glass coloration, starting with experiments by Leo Moser in November 1927. The resulting "Alexandrite" glass remains a signature color of the Moser glassworks to this day. Neodymium glass was widely emulated in the early 1930s by American glasshouses, most notably Heisey, Fostoria ("wisteria"), Cambridge ("heatherbloom"), and Steuben ("wisteria"), and elsewhere (e.g. Lalique, in France, or Murano). Tiffin's "twilight" remained in production from about 1950 to 1980. Current sources include glassmakers in the Czech Republic, the United States, and China. The sharp absorption bands of neodymium cause the glass color to change under different lighting conditions, being reddish-purple under daylight or yellow incandescent light, but blue under white fluorescent lighting, or greenish under trichromatic lighting. This color-change phenomenon is highly prized by collectors. In combination with gold or selenium, red colors are produced. Since neodymium coloration depends upon "forbidden" f-f transitions deep within the atom, there is relatively little influence on the color from the chemical environment, so the color is impervious to the thermal history of the glass. 
However, for the best color, iron-containing impurities need to be minimized in the silica used to make the glass. The same forbidden nature of the f-f transitions makes rare-earth colorants less intense than those provided by most d-transition elements, so more has to be used in a glass to achieve the desired color intensity. The original Moser recipe used about 5% neodymium oxide in the glass melt, a sufficient quantity that Moser referred to these as "rare-earth doped" glasses. Because neodymium oxide is a strong base, that level of doping would have affected the melting properties of the glass, and the lime content of the glass might have had to be adjusted accordingly. Light transmitted through neodymium glasses shows unusually sharp absorption bands; the glass is used in astronomical work to produce sharp bands by which spectral lines may be calibrated. Another application is the creation of selective astronomical filters to reduce the effect of light pollution from sodium and fluorescent lighting while passing other colours, especially the dark red hydrogen-alpha emission from nebulae. Neodymium is also used to remove the green color caused by iron contaminants from glass. Neodymium is a component of "didymium" (a mixture of salts of neodymium and praseodymium) used for coloring glass to make welder's and glass-blower's goggles; the sharp absorption bands obliterate the strong sodium emission at 589 nm. The similar absorption of the yellow mercury emission line at 578 nm is the principal cause of the blue color observed for neodymium glass under traditional white-fluorescent lighting. Neodymium and didymium glass are used in color-enhancing filters in indoor photography, particularly to filter out the yellow hues of incandescent lighting. Similarly, neodymium glass is becoming widely used more directly in incandescent light bulbs. 
These lamps contain neodymium in the glass to filter out yellow light, resulting in a whiter light which is more like sunlight. Similar to its use in glasses, neodymium salts are used as a colorant for enamels. Lasers Certain transparent materials with a small concentration of neodymium ions can be used in lasers as gain media for infrared wavelengths (1054–1064 nm), e.g. Nd:YAG (yttrium aluminium garnet), Nd:YAP (yttrium aluminium perovskite), Nd:YLF (yttrium lithium fluoride), Nd:YVO4 (yttrium orthovanadate), and Nd:glass. Neodymium-doped crystals (typically Nd:YVO4) generate high-powered infrared laser beams which are converted to green laser light in commercial DPSS hand-held lasers and laser pointers. The trivalent neodymium ion Nd3+ was the first lanthanide among the rare-earth elements used for the generation of laser radiation. The Nd:CaWO4 laser was developed in 1961. Historically, it was the third laser put into operation (the first was ruby, the second the U3+:CaF2 laser). Over the years the neodymium laser became one of the most widely used lasers for practical applications. The success of the Nd3+ ion lies in the structure of its energy levels and in spectroscopic properties suitable for the generation of laser radiation. In 1964, Geusic et al. demonstrated the operation of the neodymium ion in the YAG matrix Y3Al5O12. Nd:YAG is a four-level laser with a lower threshold and excellent mechanical and temperature properties. For optical pumping of this material it is possible to use non-coherent flashlamp radiation or a coherent diode beam. 
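The green output of the DPSS lasers mentioned above comes from frequency doubling: the 1064 nm infrared line is halved in wavelength to 532 nm. A minimal sketch of that arithmetic, with the helper function and constants being illustrative additions rather than anything from the source:

```python
# Illustrative sketch (not part of the source): second-harmonic generation
# in a DPSS laser doubles the photon frequency, which halves the wavelength,
# turning the 1064 nm Nd line into 532 nm green light.

H = 6.62607015e-34   # Planck constant, J*s (CODATA value)
C = 299_792_458      # speed of light, m/s
EV = 1.602176634e-19 # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Photon energy in electronvolts for a given vacuum wavelength in nm."""
    return H * C / (wavelength_nm * 1e-9) / EV

fundamental = 1064.0                 # nm, Nd emission line (from the text)
second_harmonic = fundamental / 2.0  # 532 nm, the familiar green-pointer line

# Doubling the frequency also doubles the photon energy (~1.17 eV -> ~2.33 eV).
print(second_harmonic, photon_energy_ev(second_harmonic))
```

The same relation explains why green pointers need a nonlinear crystal stage: the gain medium itself only emits in the infrared.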
The current laser at the UK Atomic Weapons Establishment (AWE), the HELEN (High Energy Laser Embodying Neodymium) 1-terawatt neodymium-glass laser, can access the midpoints of pressure and | the bulk metal to the further oxidation: Neodymium is a quite electropositive element, and it reacts slowly with cold water but quite quickly with hot water to form neodymium(III) hydroxide. Neodymium metal reacts vigorously with all the stable halogens, forming NdF3 (a violet substance), NdCl3 (a mauve substance), NdBr3 (a violet substance), and NdI3 (a green substance). Neodymium dissolves readily in dilute sulfuric acid to form solutions that contain the lilac Nd(III) ion, which exists as [Nd(OH2)9]3+ complexes. Compounds Neodymium compounds include: halides: neodymium(III) fluoride (NdF3), neodymium(IV) fluoride (NdF4), neodymium(III) chloride (NdCl3), neodymium(III) bromide (NdBr3), neodymium(III) iodide (NdI3) oxides: neodymium(III) oxide (Nd2O3) sulfides: neodymium(II) sulfide (NdS), neodymium(III) sulfide (Nd2S3) nitrides: neodymium(III) nitride (NdN) hydroxide: neodymium(III) hydroxide (Nd(OH)3) phosphide: neodymium phosphide (NdP) carbide: neodymium carbide nitrate: neodymium(III) nitrate (Nd(NO3)3) sulfate: neodymium(III) sulfate (Nd2(SO4)3) Some neodymium compounds have colors that vary based upon the type of lighting. Organoneodymium compounds Organoneodymium compounds are very similar to those of the other lanthanides, as they all share an inability to undergo π backbonding. They are thus mostly restricted to the mostly ionic cyclopentadienides (isostructural with those of lanthanum) and the σ-bonded simple alkyls and aryls, some of which may be polymeric. Isotopes Naturally occurring neodymium (60Nd) is composed of five stable isotopes, 142Nd, 143Nd, 145Nd, 146Nd and 148Nd, with 142Nd being the most abundant (27.2% of the natural abundance), and two radioisotopes with extremely long half-lives, 144Nd (alpha decay with a half-life (t1/2) of 2.29×1015 years) and 150Nd (double beta decay, t1/2 = approximately 7×1018 years). 
In all, 33 radioisotopes of neodymium have been detected, with the most stable radioisotopes being the naturally occurring ones: 144Nd and 150Nd. All of the remaining radioactive isotopes have half-lives shorter than twelve days, and the majority of these have half-lives shorter than 70 seconds; the most stable artificial isotope is 147Nd, with a half-life of 10.98 days. Neodymium also has 13 known meta states, the most stable being 139mNd (t1/2 = 5.5 hours), 135mNd (t1/2 = 5.5 minutes), and 133m1Nd (t1/2 ~ 70 seconds). The primary decay modes before the most abundant stable isotope, 142Nd, are electron capture and positron decay, and the primary mode after is beta minus decay. The primary decay products before 142Nd are praseodymium (Pr) isotopes and the primary products after are promethium (Pm) isotopes. Neodymium isotopes are used in a variety of scientific applications. 142Nd has been used for the production of short-lived Tm and Yb isotopes. 146Nd has been suggested for the production of 147Pm, which can be used as a source for radioactive power generation. Several neodymium isotopes have been used for the production of other promethium isotopes. The decay from 147Sm (t1/2 = 1.06 × 1011 years) to 143Nd (stable) serves as the basis of samarium–neodymium dating. 150Nd has also been used to study double beta decay. History In 1751, the Swedish mineralogist Axel Fredrik Cronstedt discovered a heavy mineral from the mine at Bastnäs, later named cerite. Thirty years later, the fifteen-year-old Wilhelm Hisinger, from the family owning the mine, sent a sample of it to Carl Scheele, who did not find any new elements within. In 1803, after Hisinger had become an ironmaster, he returned to the mineral with Jöns Jacob Berzelius and isolated a new oxide, which they named ceria after the dwarf planet Ceres, which had been discovered two years earlier. Ceria was simultaneously and independently isolated in Germany by Martin Heinrich Klaproth. 
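The samarium–neodymium dating mentioned above rests on the standard radioactive-decay law: the decay constant follows from the 147Sm half-life, and an age follows from the slope of a 143Nd/144Nd versus 147Sm/144Nd isochron. A minimal sketch, using only the half-life quoted in the text; the function names and the sample slope value are hypothetical:

```python
import math

# Illustrative sketch (not from the source): Sm-Nd isochron dating.
# lambda = ln(2) / t_half, and the isochron slope equals exp(lambda*t) - 1,
# so t = ln(slope + 1) / lambda.

T_HALF_SM147 = 1.06e11  # years, half-life of 147Sm (from the text)

def decay_constant(t_half):
    """Decay constant (per year) from a half-life in years."""
    return math.log(2) / t_half

def isochron_age(slope):
    """Age in years implied by the slope of a Sm-Nd isochron diagram."""
    return math.log(slope + 1.0) / decay_constant(T_HALF_SM147)

# A hypothetical slope of 0.03 corresponds to roughly 4.5 billion years,
# the order of magnitude of the oldest meteorites.
print(f"{isochron_age(0.03):.3e} years")
```

Because the 147Sm half-life is so long, measurable slopes only accumulate over geological timescales, which is why the method suits rocks and meteorites rather than young samples.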
Between 1839 and 1843, ceria was shown to be a mixture of oxides by the Swedish surgeon and chemist Carl Gustaf Mosander, who lived in the same house as Berzelius; he separated out two other oxides, which he named lanthana and didymia. He partially decomposed a sample of cerium nitrate by roasting it in air and then treating the resulting oxide with dilute nitric acid. The metals that formed these oxides were thus named lanthanum and didymium. In 1885, in Vienna, Carl Auer von Welsbach separated didymium into two new elements, neodymium and praseodymium. Von Welsbach confirmed the separation by spectroscopic analysis, but the products were of relatively low purity. Didymium had been discovered by Carl Gustaf Mosander in 1841, and pure neodymium was isolated from it in 1925. The name neodymium is derived from the Greek words neos (νέος), new, and didymos (διδύμος), twin. Double nitrate crystallization was the means of commercial neodymium purification until the 1950s. Lindsay Chemical Division was the first to commercialize large-scale ion-exchange purification of neodymium. Starting in the 1950s, high-purity (above 99%) neodymium was primarily obtained through an ion-exchange process from monazite, a mineral rich in rare-earth elements. The metal is obtained through electrolysis of its halide salts. Currently, most neodymium is extracted from bastnäsite and purified by solvent extraction. Ion-exchange purification is reserved for preparing the highest purities (typically >99.99%). The evolving technology, and improved purity of commercially available neodymium oxide, was reflected in the appearance of neodymium glass that resides in collections today. Early neodymium glasses made in the 1930s have a more reddish or orange tinge than modern versions, which are more cleanly purple, because of the difficulty of removing the last traces of praseodymium in the era when manufacturing relied upon fractional crystallization technology. 
Because of its role in permanent magnets used for direct-drive wind turbines, it has been argued that neodymium will be one of the main objects of geopolitical competition in a world running on renewable energy. This perspective has been criticised for failing to recognise that most wind turbines do not use permanent magnets, and for underestimating the power of economic incentives for expanded production. Occurrence and production Occurrence Neodymium is rarely found in nature as a free element, but rather it occurs in ores such as monazite and bastnäsite (these are mineral group names rather than single mineral names) that contain small amounts of all rare-earth metals. In these minerals neodymium is rarely dominant (as in the case of lanthanum), with cerium being the most abundant lanthanide; some exceptions include monazite-(Nd) and kozoite-(Nd). The main mining areas are in China, United States, Brazil, India, Sri Lanka, and Australia. The reserves of neodymium are estimated at eight million tonnes. The Nd3+ ion is similar in size to the early lanthanides of the cerium group (those from lanthanum up to samarium and europium) that immediately follow in the periodic table, and hence it tends to occur along with them in phosphate, silicate and carbonate minerals, such as monazite (MIIIPO4) and bastnäsite (MIIICO3F), where M refers to all the rare-earth metals except scandium and the radioactive promethium (mostly Ce, La, and Y, with somewhat less Pr and Nd). Bastnäsite is usually lacking in thorium and the heavy lanthanides, and the purification of the light lanthanides from it is less involved. The ore, after being crushed and ground, is first treated with hot concentrated sulfuric acid, evolving carbon dioxide, hydrogen fluoride, and silicon tetrafluoride. The product is then dried and leached with water, leaving the early lanthanide ions, including lanthanum, in solution. 
In space Neodymium's per-particle abundance in the Solar System is 0.083 ppb (parts per billion). This figure is about two thirds of that of platinum, but two and a half times that of mercury, and nearly five times that of gold. The lanthanides are scarce in space generally and are much more abundant in the Earth's crust. In the Earth's crust Neodymium is classified as a lithophile under the Goldschmidt classification, meaning that it is generally found combined with oxygen. Although it belongs to the rare-earth metals, neodymium is not rare at all. Its abundance in the Earth's crust is about 38 mg/kg, which makes it the 27th most common element. It is similar in abundance to lanthanum. Cerium is the most common rare-earth metal, followed by neodymium and then lanthanum. Production The world's production of neodymium was about 7,000 tonnes in 2004. The bulk of current production is from China. Historically, the Chinese government imposed strategic material controls on the element, causing large fluctuations in prices. The uncertainty of pricing and availability has caused companies (particularly Japanese ones) to create permanent magnets and associated electric motors with fewer rare-earth metals; however, so far they have been unable to eliminate the need for neodymium. According to the US Geological Survey, Greenland holds the largest reserves of undeveloped rare-earth deposits, particularly neodymium. Mining interests clash with native populations at those sites due to the release of radioactive substances during the mining process. Neodymium is typically 10–18% of the rare-earth content of commercial deposits of the |
albeit in tiny quantities (a few becquerels). Neptunium's unique radioactive characteristics allowed it to be traced as it moved through various compounds in chemical reactions; at first this was the only method available to prove that its chemistry was different from that of other elements. As the first isotope of neptunium to be discovered had such a short half-life, McMillan and Abelson were unable to prepare a sample large enough to perform chemical analysis of the new element using the technology then available. However, after the discovery of the long-lived 237Np isotope in 1942 by Glenn Seaborg and Arthur Wahl, forming weighable amounts of neptunium became a realistic endeavor. Its half-life was initially determined to be about 3 million years (later revised to 2.144 million years), confirming the predictions of Nishina and Kimura of a very long half-life. Early research into the element was somewhat limited because most of the nuclear physicists and chemists in the United States at the time were focused on the massive effort to research the properties of plutonium as part of the Manhattan Project. Research into the element did continue as a minor part of the project, and the first bulk sample of neptunium was isolated in 1944. Much of the research into the properties of neptunium since then has focused on understanding how to confine it as a portion of nuclear waste. Because it has isotopes with very long half-lives, it is of particular concern in the context of designing confinement facilities that can last for thousands of years. It has found some limited uses as a radioactive tracer and as a precursor for various nuclear reactions to produce useful plutonium isotopes. However, most of the neptunium that is produced as a reaction byproduct in nuclear power stations is considered a waste product. Production Synthesis The vast majority of the neptunium that currently exists on Earth was produced artificially in nuclear reactions. 
Neptunium-237 is the most commonly synthesized isotope due to it being the only one that both can be created via neutron capture and also has a half-life long enough to allow weighable quantities to be easily isolated. As such, it is by far the most common isotope to be utilized in chemical studies of the element. When an 235U atom captures a neutron, it is converted to an excited state of 236U. About 81% of the excited 236U nuclei undergo fission, but the remainder decay to the ground state of 236U by emitting gamma radiation. Further neutron capture creates 237U which has a half-life of 7 days and quickly decays to 237Np through beta decay. During beta decay, the excited 237U emits an electron, while the atomic weak interaction converts a neutron to a proton, thus creating 237Np. 237U is also produced via an (n,2n) reaction with 238U. This only happens with very energetic neutrons. 237Np is the product of alpha decay of 241Am, which is produced through neutron irradiation of uranium-238. Heavier isotopes of neptunium decay quickly, and lighter isotopes of neptunium cannot be produced by neutron capture, so chemical separation of neptunium from cooled spent nuclear fuel gives nearly pure 237Np. The short-lived heavier isotopes 238Np and 239Np, useful as radioactive tracers, are produced through neutron irradiation of 237Np and 238U respectively, while the longer-lived lighter isotopes 235Np and 236Np are produced through irradiation of 235U with protons and deuterons in a cyclotron. Artificial 237Np metal is usually isolated through a reaction of 237NpF3 with liquid barium or lithium at around 1200 °C and is most often extracted from spent nuclear fuel rods in kilogram amounts as a by-product in plutonium production. 2 NpF3 + 3 Ba → 2 Np + 3 BaF2 By weight, neptunium-237 discharges are about 5% as great as plutonium discharges and about 0.05% of spent nuclear fuel discharges. However, even this fraction still amounts to more than fifty tons per year globally. 
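The ingrowth of 237Np from 237U described above follows the exponential-decay law. Because 237Np's own half-life (2.144 million years) dwarfs 237U's 7 days, essentially none of the ingrown neptunium decays during the process, so the converted fraction is simply 1 − e^(−λt). A minimal sketch, with the function name being an illustrative addition:

```python
import math

# Illustrative sketch (not from the source): ingrowth of 237Np from the
# beta decay of 237U (half-life 7 days, from the text). Since 237Np's own
# half-life is ~2.144 million years, its decay over this period is
# negligible, giving N_Np(t) ~= N_U(0) * (1 - exp(-lambda_U * t)).

T_HALF_U237_DAYS = 7.0

def np237_fraction(days):
    """Fraction of an initial 237U sample converted to 237Np after `days`."""
    lam = math.log(2) / T_HALF_U237_DAYS
    return 1.0 - math.exp(-lam * days)

# After one half-life, half the 237U has become 237Np; after ten
# half-lives (70 days), conversion is essentially complete.
print(np237_fraction(7))   # ~0.5
print(np237_fraction(70))  # ~0.999
```

This quick conversion is why spent fuel cooled for even a few months yields nearly pure 237Np on chemical separation, as the text notes.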
Purification methods Recovering uranium and plutonium from spent nuclear fuel for reuse is one of the major processes of the nuclear fuel cycle. As it has a long half-life of just over 2 million years, the alpha emitter 237Np is one of the major isotopes of the minor actinides separated from spent nuclear fuel. Many separation methods have been used to separate out the neptunium, operating on small and large scales. The small-scale purification operations have the goals of preparing pure neptunium as a precursor of metallic neptunium and its compounds, and also to isolate and preconcentrate neptunium in samples for analysis. Most methods that separate neptunium ions exploit the differing chemical behaviour of the differing oxidation states of neptunium (from +3 to +6 or sometimes even +7) in solution. Among the methods that are or have been used are: solvent extraction (using various extractants, usually multidentate β-diketone derivatives, organophosphorus compounds, and amine compounds), chromatography using various ion-exchange or chelating resins, coprecipitation (possible matrices include LaF3, BiPO4, BaSO4, Fe(OH)3, and MnO2), electrodeposition, and biotechnological methods. Currently, commercial reprocessing plants use the Purex process, involving the solvent extraction of uranium and plutonium with tributyl phosphate. Chemistry and compounds Solution chemistry When it is in an aqueous solution, neptunium can exist in any of its six possible oxidation states (+2 to +7) and each of these shows a characteristic color. The stability of each oxidation state is strongly dependent on various factors, such as the presence of oxidizing or reducing agents, pH of the solution, presence of coordination complex-forming ligands, and even the concentration of neptunium in the solution. In acidic solutions, the neptunium(III) to neptunium(VII) ions exist as Np3+, Np4+, NpO2+, NpO22+, and a Np(VII) cation. 
In basic solutions, they exist as the oxides and hydroxides Np(OH)3, NpO2, NpO2OH, NpO2(OH)2, and a Np(VII) species. Not as much work has been done to characterize neptunium in basic solutions. Np3+ and Np4+ can easily be reduced and oxidized to each other, as can NpO2+ and NpO22+. Neptunium(III) Np(III) or Np3+ exists as hydrated complexes in acidic solutions. It is a dark blue-purple and is analogous to its lighter congener, the pink rare-earth ion Pm3+. In the presence of oxygen, it is quickly oxidized to Np(IV) unless strong reducing agents are also present. Nevertheless, it is the second-least easily hydrolyzed neptunium ion in water, forming the NpOH2+ ion. Np3+ is the predominant neptunium ion in solutions of pH 4–5. Neptunium(IV) Np(IV) or Np4+ is pale yellow-green in acidic solutions, where it exists as hydrated complexes. It is quite unstable to hydrolysis in acidic aqueous solutions at pH 1 and above, forming NpOH3+. In basic solutions, Np4+ tends to hydrolyze to form the neutral neptunium(IV) hydroxide (Np(OH)4) and neptunium(IV) oxide (NpO2). Neptunium(V) Np(V) or NpO2+ is green-blue in aqueous solution, in which it behaves as a strong Lewis acid. It is a stable ion and is the most common form of neptunium in aqueous solutions. Unlike its neighboring homologues UO2+ and PuO2+, NpO2+ does not spontaneously disproportionate except at very low pH and high concentration: 2 NpO2+ + 4 H+ ⇌ Np4+ + NpO22+ + 2 H2O It hydrolyzes in basic solutions to form NpO2OH and NpO2(OH)2−. Neptunium(VI) Np(VI) or NpO22+, the neptunyl ion, shows a light pink or reddish color in an acidic solution and yellow-green otherwise. It is a strong Lewis acid and is the main neptunium ion encountered in solutions of pH 3–4. Though stable in acidic solutions, it is quite easily reduced to the Np(V) ion, and it is not as stable as the homologous hexavalent ions of its neighbours uranium and plutonium (the uranyl and plutonyl ions). It hydrolyzes in basic solutions to form oxo and hydroxo ions such as NpO2OH+. 
Neptunium(VII) Np(VII) is dark green in a strongly basic solution. Though a simple formula for Np(VII) in basic solution is frequently cited, this is a simplification and the real structure is probably closer to a hydroxo species. Np(VII) was first prepared in basic solution in 1967. Np(VII) is also found in strongly acidic solution, but water quickly reduces it to Np(VI). Its hydrolysis products are uncharacterized. Hydroxides The oxides and hydroxides of neptunium are closely related to its ions. In general, Np hydroxides at various oxidation levels are less stable than those of the actinides before it on the periodic table, such as thorium and uranium, and more stable than those of the actinides after it, such as plutonium and americium. This phenomenon occurs because the stability of an ion increases as the ratio of atomic number to the radius of the ion increases. Thus actinides higher on the periodic table will more readily undergo hydrolysis. Neptunium(III) hydroxide is quite stable in acidic solutions and in environments that lack oxygen, but it will rapidly oxidize to the IV state in the presence of air. It is not soluble in water. Np(IV) hydroxides exist mainly as the electrically neutral Np(OH)4, and its mild solubility in water is not affected at all by the pH of the solution. This suggests that the other Np(IV) hydroxide does not have a significant presence. Because the Np(V) ion is very stable, it can only form a hydroxide in high acidity levels. When placed in a 0.1 M sodium perchlorate solution, it does not react significantly for a period of months, although a higher molar concentration of 3.0 M will result in it reacting to form the solid hydroxide NpO2OH almost immediately. Np(VI) hydroxide is more reactive but it is still fairly stable in acidic solutions. It will form the compound NpO3·H2O in the presence of ozone under various carbon dioxide pressures. Np(VII) has not been well-studied and no neutral hydroxides have been reported. It probably exists mostly as a hydroxo complex. 
Oxides Three anhydrous neptunium oxides have been reported, NpO2, Np2O5, and Np5O8, though some studies have stated that only the first two of these exist, suggesting that claims of Np5O8 are actually the result of mistaken analysis of Np2O5. However, as the full extent of the reactions that occur between neptunium and oxygen has yet to be researched, it is not certain which of these claims is accurate. Although neptunium oxides have not been produced with neptunium in oxidation states as high as those possible with the adjacent actinide uranium, neptunium oxides are more stable at lower oxidation states. This behavior is illustrated by the fact that NpO2 can be produced by simply burning neptunium salts of oxyacids in air. The greenish-brown NpO2 is very stable over a large range of pressures and temperatures and does not undergo phase transitions at low temperatures. It does show a phase transition from face-centered cubic to orthorhombic at around 33–37 GPa, although it returns to its original phase when the pressure is released. It remains stable under oxygen pressures up to 2.84 MPa and temperatures up to 400 °C. Np2O5 is black-brown in color and monoclinic with a lattice size of 418×658×409 picometres. It is relatively unstable and decomposes to NpO2 and O2 at 420–695 °C. Although Np2O5 was initially the subject of several studies that claimed to produce it with mutually contradictory methods, it was eventually prepared successfully by heating neptunium peroxide to 300–350 °C for 2–3 hours or by heating it under a layer of water in an ampoule at 180 °C. Neptunium also forms a large number of oxide compounds with a wide variety of elements, although the neptunate oxides formed with alkali metals and alkaline earth metals have been by far the most studied. Ternary neptunium oxides are generally formed by reacting NpO2 with the oxide of another element or by precipitating from an alkaline solution. 
Li5NpO6 has been prepared by reacting Li2O and NpO2 at 400 °C for 16 hours or by reacting Li2O2 with NpO3·H2O at 400 °C for 16 hours in a quartz tube under flowing oxygen. Alkali neptunate compounds K3NpO5, Cs3NpO5, and Rb3NpO5 are all created by a similar reaction: NpO2 + 3 MO2 → M3NpO5 (M = K, Cs, Rb) The oxide compounds KNpO4, CsNpO4, and RbNpO4 are formed by reacting Np(VII) with a compound of the alkali metal nitrate and ozone. Additional compounds have been produced by reacting NpO3 and water with solid alkali and alkaline peroxides at temperatures of 400–600 °C for 15–30 hours. Some of these include Ba3(NpO5)2, Ba2NaNpO6, and Ba2LiNpO6. Also, a considerable number of hexavalent neptunium oxides are formed by reacting solid-state NpO2 with various alkali or alkaline earth oxides in an environment of flowing oxygen. Many of the resulting compounds also have an equivalent compound that substitutes uranium for neptunium. Some compounds that have been characterized include Na2Np2O7, Na4NpO5, Na6NpO6, and Na2NpO4. These can be obtained by heating different combinations of NpO2 and Na2O to various temperature thresholds, and further heating will also cause these compounds to exhibit different neptunium allotropes. The lithium neptunate oxides Li6NpO6 and Li4NpO5 can be obtained with similar reactions of NpO2 and Li2O. A large number of additional alkali and alkaline neptunium oxide compounds such as Cs4Np5O17 and Cs2Np3O10 have been characterized with various production methods. Neptunium has also been observed to form ternary oxides with many additional elements in groups 3 through 7, although these compounds are much less well studied. Halides Although neptunium halide compounds have not been nearly as well studied as its oxides, a fairly large number have been successfully characterized. Of these, neptunium fluorides have been the most extensively researched, largely because of their potential use in separating the element from nuclear waste products. 
Four binary neptunium fluoride compounds, NpF3, NpF4, NpF5, and NpF6, have been reported. The first two are fairly stable and were first prepared in 1947 through the following reactions: NpO2 + ½ H2 + 3 HF → NpF3 + 2 H2O (400 °C) NpF3 + ¼ O2 + HF → NpF4 + ½ H2O (400 °C) Later, NpF4 was obtained directly by heating NpO2 to various temperatures in mixtures of either hydrogen fluoride or pure fluorine gas. NpF5 is much more difficult to create and most known preparation methods involve reacting NpF4 or NpF6 compounds with various other fluoride compounds. NpF5 will decompose into NpF4 and NpF6 when heated to around 320 °C. NpF6 or neptunium hexafluoride is extremely volatile, as are its adjacent actinide compounds uranium hexafluoride (UF6) and plutonium hexafluoride (PuF6). This volatility has attracted considerable interest in the compound as the basis of a simple method for extracting neptunium from spent nuclear power station fuel rods. NpF6 was first prepared in 1943 by reacting NpF3 and gaseous fluorine at very high temperatures and the first bulk quantities were obtained in 1958 by heating NpF4 and dripping pure fluorine on it in a specially prepared apparatus. Additional methods that have successfully produced neptunium hexafluoride include reacting BrF3 and BrF5 with NpF4 and reacting several different neptunium oxide and fluoride compounds with anhydrous hydrogen fluoride. Four neptunium oxyfluoride compounds, NpO2F, NpOF3, NpO2F2, and NpOF4, have been reported, although none of them have been extensively studied. NpO2F2 is a pinkish solid and can be prepared by reacting NpO3 · H2O and Np2F5 with pure fluorine at around 330 °C. NpOF3 and NpOF4 can be produced by reacting neptunium oxides with anhydrous hydrogen fluoride at various temperatures. Neptunium also forms fluoride compounds with a wide variety of other elements. Some of these that have been characterized include CsNpF6, Rb2NpF7, Na3NpF8, and K3NpO2F5. 
Two neptunium chlorides, NpCl3 and NpCl4, have been characterized. Although several attempts to create NpCl5 have been made, they have not been successful. NpCl3 is created by reducing neptunium dioxide with hydrogen and carbon tetrachloride (CCl4) and NpCl4 by reacting a neptunium oxide with CCl4 at around 500 °C. Other neptunium chloride compounds have also been reported, including NpOCl2, Cs2NpCl6, Cs3NpO2Cl4, and Cs2NaNpCl6. Neptunium bromides NpBr3 and NpBr4 have also been created; the latter by reacting aluminium bromide with NpO2 at 350 °C and the former in an almost identical procedure but with zinc present. The neptunium iodide NpI3 has also been prepared by the same method as NpBr3. Chalcogenides, pnictides, and carbides Neptunium chalcogen and pnictogen compounds have been well studied primarily as part of research into their electronic and magnetic properties and their interactions in the natural environment. Pnictide and carbide compounds have also attracted interest because of their presence in the fuel of several advanced nuclear reactor designs, although the latter group has not had nearly as much research as the former. Chalcogenides A wide variety of neptunium sulfide compounds have been characterized, including the pure sulfide compounds NpS, NpS3, Np2S5, Np3S5, Np2S3, and Np3S4. Of these, Np2S3, prepared by reacting NpO2 with hydrogen sulfide and carbon disulfide at around 1000 °C, is the most well-studied and three allotropic forms are known. The α form exists up to around 1230 °C, the β up to 1530 °C, and the γ form, which can also exist as Np3S4, at higher temperatures. NpS can be created by reacting Np2S3 and neptunium metal at 1600 °C and Np3S5 can be prepared by the decomposition of Np2S3 at 500 °C or by reacting sulfur and neptunium hydride at 650 °C. Np2S5 is made by heating a mixture of Np3S5 and pure sulfur to 500 °C. 
All of the neptunium sulfides except for the β and γ forms of Np2S3 are isostructural with the equivalent uranium sulfide and several, including NpS, α−Np2S3, and β−Np2S3, are also isostructural with the equivalent plutonium sulfide. The oxysulfides NpOS, Np4O4S, and Np2O2S have also been created, although none of them have been well studied. NpOS was first prepared in 1985 by vacuum sealing NpO2, Np3S5, and pure sulfur in a quartz tube and heating it to 900 °C for one week. Neptunium selenide compounds that have been reported include NpSe, NpSe3, Np2Se3, Np2Se5, Np3Se4, and Np3Se5. All of these have only been obtained by heating neptunium hydride and selenium metal to various temperatures in a vacuum for an extended period of time and Np2Se3 is only known to exist in the γ allotrope at relatively high temperatures. The two known neptunium oxyselenide compounds, NpOSe and Np2O2Se, are formed by similar methods, with neptunium dioxide replacing the neptunium hydride. The known neptunium telluride compounds NpTe, NpTe3, Np3Te4, Np2Te3, and Np2O2Te are formed by similar procedures to the selenides and Np2O2Te is isostructural to the equivalent uranium and plutonium compounds. No neptunium−polonium compounds have been reported. Pnictides and carbides Neptunium nitride (NpN) was first prepared in 1953 by reacting neptunium hydride and ammonia gas at around 750 °C in a quartz capillary tube. Later, it was produced by reacting different mixtures of nitrogen and hydrogen with neptunium metal at various temperatures. It has also been created by the reduction of neptunium dioxide with diatomic nitrogen gas at 1550 °C. NpN is isomorphous with uranium mononitride (UN) and plutonium mononitride (PuN) and has a melting point of 2830 °C under a nitrogen pressure of around 1 MPa. Two neptunium phosphide compounds have been reported, NpP and Np3P4. 
The first has a face-centered cubic structure and is prepared by converting neptunium metal to a powder and then reacting it with phosphine gas at 350 °C. Np3P4 can be created by reacting neptunium metal with red phosphorus at 740 °C in a vacuum and then allowing any extra phosphorus to sublimate away. The compound is non-reactive with water but will react with nitric acid to produce a Np(IV) solution. Three neptunium arsenide compounds have been prepared, NpAs, NpAs2, and Np3As4. The first two were first created by heating arsenic and neptunium hydride in a vacuum-sealed tube for about a week. Later, NpAs was also made by confining neptunium metal and arsenic in a vacuum tube, separating them with a quartz membrane, and heating them to just below neptunium's melting point of 639 °C, which is slightly higher than arsenic's sublimation point of 615 °C. Np3As4 is prepared by a similar procedure using iodine as a transporting agent. NpAs2 crystals are brownish gold and Np3As4 is black. The neptunium antimonide compound NpSb was created in 1971 by placing equal quantities of both elements in a vacuum tube, heating them to the melting point of antimony, and then further to 1000 °C for sixteen days. This procedure also created trace amounts of an additional antimonide compound, Np3Sb4. One neptunium-bismuth compound, NpBi, has also been reported. The neptunium carbides NpC, Np2C3, and NpC2 (tentative) have been reported, but have not been characterized in detail despite the high importance and utility of actinide carbides as advanced nuclear reactor fuel. NpC is a non-stoichiometric compound, and could be better labelled as NpCx (0.82 ≤ x ≤ 0.96). It may be obtained from the reaction of neptunium hydride with graphite at 1400 °C or by heating the constituent elements together in an electric arc furnace using a tungsten electrode. It reacts with excess carbon to form pure Np2C3. NpC2 is formed from heating NpO2 in a graphite crucible at 2660–2800 °C. 
Other inorganic Hydrides Neptunium reacts with hydrogen in a similar manner to its neighbor plutonium, forming the hydrides NpH2+x (face-centered cubic) and NpH3 (hexagonal). These are isostructural with the corresponding plutonium hydrides, although unlike PuH2+x, the lattice parameters of NpH2+x become greater as the hydrogen content (x) increases. The hydrides require extreme care in handling as they decompose in a vacuum at 300 °C to form finely divided neptunium metal, which is pyrophoric. Phosphates, sulfates, and carbonates Being chemically stable, neptunium phosphates have been investigated for potential use in immobilizing nuclear waste. Neptunium pyrophosphate (α-NpP2O7), a green solid, has been produced in the reaction between neptunium dioxide and boron phosphate at 1100 °C, though neptunium(IV) phosphate has so far remained elusive. The series of compounds NpM2(PO4)3, where M is an alkali metal (Li, Na, K, Rb, or Cs), are all known. Some neptunium sulfates have been characterized, both aqueous and solid and at various oxidation states of neptunium (IV through VI have been observed). Additionally, neptunium carbonates have been investigated to achieve a better understanding of the behavior of neptunium in geological repositories and the environment, where it may come into contact with carbonate and bicarbonate aqueous solutions and form soluble complexes. Organometallic A few organoneptunium compounds are known and chemically characterized, although not as many as for uranium due to neptunium's scarcity and radioactivity. The most well known organoneptunium compounds are the cyclopentadienyl and cyclooctatetraenyl compounds and their derivatives. The trivalent cyclopentadienyl compound Np(C5H5)3·THF was obtained in 1972 from reacting Np(C5H5)3Cl with sodium, although the simpler Np(C5H5) could not be obtained. 
Tetravalent neptunium cyclopentadienyl, a reddish-brown complex, was synthesized in 1968 by reacting neptunium(IV) chloride with potassium cyclopentadienide: NpCl4 + 4 KC5H5 → Np(C5H5)4 + 4 KCl It is soluble in benzene and THF, and is less sensitive to oxygen and water than Pu(C5H5)3 and Am(C5H5)3. Other Np(IV) cyclopentadienyl compounds are known for many ligands: they have the general formula (C5H5)3NpL, where L represents a ligand. Neptunocene, Np(C8H8)2, was synthesized in 1970 by reacting neptunium(IV) chloride with K2(C8H8). It is isomorphous to uranocene and plutonocene, and they behave chemically identically: all three compounds are insensitive to water or dilute bases but are sensitive to air, reacting quickly to form oxides, and are only slightly soluble in benzene and toluene. Other known neptunium cyclooctatetraenyl derivatives include Np(RC8H7)2 (R = ethanol, butanol) and KNp(C8H8)·2THF, which is isostructural to the corresponding plutonium compound. In addition, neptunium hydrocarbyls have been prepared, and solvated triiodide complexes of neptunium are a precursor to many organoneptunium and inorganic neptunium compounds. Coordination complexes There is much interest in the coordination chemistry of neptunium, because its five oxidation states all exhibit their own distinctive chemical behavior, and the coordination chemistry of the actinides is heavily influenced by the actinide contraction (the greater-than-expected decrease in ionic radii across the actinide series, analogous to the lanthanide contraction). Solid state Few neptunium(III) coordination compounds are known, because Np(III) is readily oxidized by atmospheric oxygen while in aqueous solution. However, sodium formaldehyde sulfoxylate can reduce Np(IV) to Np(III), stabilizing the lower oxidation state and forming various sparingly soluble Np(III) coordination complexes. 
Many neptunium(IV) coordination compounds have been reported, the first one being isostructural with the analogous uranium(IV) coordination compound. Other Np(IV) coordination compounds are known, some involving other metals such as cobalt (an octahydrate formed at 400 K) and copper (a hexahydrate formed at 600 K). Complex nitrate compounds are also known. Among neptunium intermetallic compounds, NpAl3 is ferromagnetic, NpGe3 has no magnetic ordering, and NpSn3 behaves fermionically. Investigations are underway regarding alloys of neptunium with uranium, americium, plutonium, zirconium, and iron, so as to recycle long-lived waste isotopes such as neptunium-237 into shorter-lived isotopes more useful as nuclear fuel. One neptunium-based superconductor alloy has been discovered with formula NpPd5Al2. Superconductivity in neptunium compounds is somewhat surprising because they often exhibit strong magnetism, which usually destroys superconductivity. The alloy has a tetragonal structure with a superconductivity transition temperature of −268.3 °C (4.9 K). Chemical Neptunium has five ionic oxidation states ranging from +3 to +7 when forming chemical compounds, which can be simultaneously observed in solutions. It is the heaviest actinide that can lose all its valence electrons in a stable compound. The most stable state in solution is +5, but the valence +4 is preferred in solid neptunium compounds. Neptunium metal is very reactive. Ions of neptunium are prone to hydrolysis and formation of coordination compounds. Atomic A neptunium atom has 93 electrons, arranged in the configuration [Rn] 5f4 6d1 7s2. This differs from the configuration expected by the Aufbau principle in that one electron is in the 6d subshell instead of being as expected in the 5f subshell. This is because of the similarity of the electron energies of the 5f, 6d, and 7s subshells. 
In forming compounds and ions, all the valence electrons may be lost, leaving behind an inert core of inner electrons with the electron configuration of the noble gas radon; more commonly, only some of the valence electrons will be lost. The electron configuration for the tripositive ion Np3+ is [Rn] 5f4, with the outermost 7s and 6d electrons lost first: this is exactly analogous to neptunium's lanthanide homolog promethium, and conforms to the trend set by the other actinides with their [Rn] 5fn electron configurations in the tripositive state. An upper limit on the first ionization potential of neptunium was measured in 1974, based on the assumption that the 7s electrons would ionize before 5f and 6d; more recent measurements have refined this to 6.2657 eV. Isotopes 24 neptunium radioisotopes have been characterized, with the most stable being 237Np with a half-life of 2.14 million years, 236Np with a half-life of 154,000 years, and 235Np with a half-life of 396.1 days. All of the remaining radioactive isotopes have half-lives that are less than 4.5 days, and the majority of these have half-lives that are less than 50 minutes. This element also has at least four meta states, with the most stable being 236mNp with a half-life of 22.5 hours. The isotopes of neptunium range in atomic weight from 219.032 u (219Np) to 244.068 u (244Np), though 221Np and 222Np have not yet been reported. Most of the isotopes that are lighter than the most stable one, 237Np, decay primarily by electron capture although a sizable number, most notably 229Np and 230Np, also exhibit various levels of decay via alpha emission to become protactinium. 237Np itself, being the beta-stable isobar of mass number 237, decays almost exclusively by alpha emission into 233Pa, with very rare (occurring only about once in trillions of decays) spontaneous fission and cluster decay (emission of 30Mg to form 207Tl). 
All of the known isotopes except one that are heavier than this decay exclusively via beta emission. The lone exception, 240mNp, exhibits a rare (>0.12%) decay by isomeric transition in addition to beta emission. 237Np eventually decays to form bismuth-209 and thallium-205, unlike most other common heavy nuclei which decay into isotopes of lead. This decay chain is known as the neptunium series. This decay chain had long been extinct on Earth due to the short half-lives of all of its isotopes above bismuth-209, but is now being resurrected thanks to artificial production of neptunium on the tonne scale. The isotopes neptunium-235, -236, and -237 are predicted to be fissile; only neptunium-237's fissionability has been experimentally shown, with the critical mass being about 60 kg, only about 10 kg more than that of the commonly used uranium-235. Calculated values of the critical masses of neptunium-235, -236, and -237 respectively are 66.2 kg, 6.79 kg, and 63.6 kg: the neptunium-236 value is even lower than that of plutonium-239. In particular, 236Np also has a low neutron cross section. Despite this, a neptunium atomic bomb has never been built: uranium and plutonium have lower critical masses than 235Np and 237Np, and 236Np is difficult to purify as it is not found in quantity in spent nuclear fuel and is nearly impossible to separate in any significant quantities from its parent 237Np. Occurrence Since all isotopes of neptunium have half-lives that are many times shorter than the age of the Earth, any primordial neptunium should have decayed by now. After only about 80 million years, the concentration of even the longest-lived isotope, 237Np, would have been reduced to less than one-trillionth (10−12) of its original amount. Thus neptunium is present in nature only in negligible amounts produced as intermediate decay products of other isotopes. 
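The stated decline of primordial 237Np can be checked directly from the exponential-decay law N(t) = N₀ · 2^(−t/T½). A minimal sketch in Python (the 2.14-million-year half-life is from the text above; the rest is standard decay arithmetic, not a source value):

```python
import math

NP237_HALF_LIFE = 2.14e6  # years, half-life of 237Np (from the text)

def remaining_fraction(t_years, half_life=NP237_HALF_LIFE):
    """Fraction of the original inventory left after t_years of decay."""
    return 2.0 ** (-t_years / half_life)

# Fraction of any primordial 237Np surviving after ~80 million years
frac = remaining_fraction(80e6)

# Time needed to fall to one-trillionth (1e-12) of the original amount
t_trillionth = NP237_HALF_LIFE * math.log2(1e12)

print(f"fraction left after 80 My: {frac:.1e}")
print(f"years to reach 1e-12: {t_trillionth:.2e}")
```

With these numbers the surviving fraction after 80 million years is a few times 10⁻¹², and roughly 85 million years brings it below one-trillionth, in line with the text's rounded 80-million-year figure.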
Trace amounts of the neptunium isotopes neptunium-237 and -239 are found naturally as decay products from transmutation reactions in uranium ores. In particular, 239Np and 237Np are the most common of these isotopes; they are directly formed from neutron capture by uranium-238 atoms. These neutrons come from the spontaneous fission of uranium-238, naturally neutron-induced fission of uranium-235, cosmic ray spallation of nuclei, and light elements absorbing alpha particles and emitting a neutron. The half-life of 239Np is very short, although the detection of its much longer-lived daughter 239Pu in nature in 1951 definitively established its natural occurrence. In 1952, 237Np was identified and isolated from concentrates of uranium ore from the Belgian Congo: in these minerals, the ratio of neptunium-237 to uranium is less than or equal to about 10−12 to 1. Most neptunium (and plutonium) now encountered in the environment is due to atmospheric nuclear explosions that took place between the detonation of the first atomic bomb in 1945 and the ratification of the Partial Nuclear Test Ban Treaty in 1963. The total amount of neptunium released by these explosions and the few atmospheric tests that have been carried out since 1963 is estimated to be around 2500 kg. The overwhelming majority of this is composed of the long-lived isotopes 236Np and 237Np since even the moderately long-lived 235Np (half-life 396 days) would have decayed to less than one-billionth (10−9) its original concentration over the intervening decades. An additional very small amount of neptunium, created by neutron irradiation of natural uranium in nuclear reactor cooling water, is released when the water is discharged into rivers or lakes. The concentration of 237Np in seawater is approximately 6.5 × 10−5 millibecquerels per liter: this concentration is between 0.1% and 1% that of plutonium. Once in the environment, neptunium generally oxidizes fairly quickly, usually to the +4 or +5 state. 
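The seawater activity quoted above can be converted into an atom count via N = A/λ, where λ = ln 2 / T½. A back-of-the-envelope sketch in Python (the activity and half-life are from the text; the unit conversions and Avogadro-number arithmetic are mine, not source values):

```python
import math

ACTIVITY = 6.5e-5 * 1e-3        # 6.5e-5 mBq/L converted to Bq/L (decays per second per litre)
HALF_LIFE_S = 2.14e6 * 3.156e7  # 237Np half-life: 2.14e6 years in seconds (~3.156e7 s/year)

decay_const = math.log(2) / HALF_LIFE_S   # lambda, in s^-1
atoms_per_litre = ACTIVITY / decay_const  # N = A / lambda

# Corresponding mass, using a molar mass of ~237 g/mol and Avogadro's number
grams_per_litre = atoms_per_litre * 237 / 6.022e23

print(f"{atoms_per_litre:.1e} atoms/L")
print(f"{grams_per_litre:.1e} g/L")
```

The quoted activity thus corresponds to only a few million 237Np atoms, a few femtograms, per litre of seawater.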
Regardless of its oxidation state, the element exhibits much greater mobility than the other actinides, largely due to its ability to readily form aqueous solutions with various other elements. In one study comparing the diffusion rates of neptunium(V), plutonium(IV), and americium(III) in sandstone and limestone, neptunium penetrated more than ten times as well as the other elements. Np(V) will also react efficiently at pH levels greater than 5.5 if there are no carbonates present, and in these conditions it has also been observed to readily bond with quartz. It has also been observed to bond well with goethite, ferric oxide colloids, and several clays including kaolinite and smectite. Np(V) bonds nearly an order of magnitude less readily to soil particles in mildly acidic conditions than its fellow actinides americium and curium. This behavior enables it to migrate rapidly through the soil while in solution without becoming fixed in place, contributing further to its mobility. Np(V) is also readily absorbed by concrete, which because of the element's radioactivity is a consideration that must be addressed when building nuclear waste storage facilities. When absorbed in concrete, it is reduced to Np(IV) in a relatively short period of time. Np(V) is also reduced by humic acid if it is present on the surface of goethite, hematite, and magnetite. Np(IV) is absorbed efficiently by tuff, granodiorite, and bentonite, although uptake by the latter is most pronounced in mildly acidic conditions. It also exhibits a strong tendency to bind to colloidal particulates, an effect that is enhanced when in soil with high clay content. This behavior provides an additional aid to the element's observed high mobility. History Background and early claims When the first periodic table of the elements was published by Dmitri Mendeleev in the early 1870s, it showed a " — " in place after uranium, similar to several other places left for then-undiscovered elements. 
Other subsequent tables of known elements, including a 1913 publication of the known radioactive isotopes by Kasimir Fajans, also show an empty place after uranium, element 92. Up to and after the discovery of the final component of the atomic nucleus, the neutron, in 1932, most scientists did not seriously consider the possibility of elements heavier than uranium. While nuclear theory at the time did not explicitly prohibit their existence, there was little evidence to suggest that they existed. However, the discovery of induced radioactivity by Irène and Frédéric Joliot-Curie in late 1933 opened up an entirely new method of researching the elements and inspired a small group of Italian scientists led by Enrico Fermi to begin a series of experiments involving neutron bombardment. Although the Joliot-Curies' experiment involved bombarding a sample of 27Al with alpha particles to produce the radioactive 30P, Fermi realized that using neutrons, which have no electrical charge, would most likely produce even better results than the positively charged alpha particles. Accordingly, in March 1934 he began systematically subjecting all of the then-known elements to neutron bombardment to determine whether radioactivity could be induced in them as well. After several months of work, Fermi's group had tentatively determined that lighter elements would disperse the energy of the captured neutron by emitting a proton or alpha particle and heavier elements would generally accomplish the same by emitting a gamma ray. In the latter case, the captured neutron would subsequently undergo beta decay into a proton, thus moving the resulting isotope one place up the periodic table. When Fermi's team bombarded uranium, they observed this behavior as well, which strongly suggested that the resulting isotope had an atomic number of 93. 
Fermi was initially reluctant to publicize such a claim, but after his team observed several unknown half-lives in the uranium bombardment products that did not match those of any known isotope, he published a paper entitled Possible Production of Elements of Atomic Number Higher than 92 in June 1934. In it he proposed the name ausonium (atomic symbol Ao) for element 93, after the Greek name Ausonia (Italy). Several theoretical objections to the claims of Fermi's paper were quickly raised; in particular, the exact process that took place when an atom captured a neutron was not well understood at the time. This, and Fermi's accidental discovery three months later that nuclear reactions could be induced by slow neutrons, led many scientists, notably Aristid von Grosse and Ida Noddack, to doubt that the experiment was creating element 93. While von Grosse's claim that Fermi was actually producing protactinium (element 91) was quickly tested and disproved, Noddack's proposal that the uranium had been shattered into two or more much smaller fragments was simply ignored by most because existing nuclear theory did not include a way for this to be possible. Fermi and his team maintained that they were in fact synthesizing a new element, but the issue remained unresolved for several years. Although the many different and unknown radioactive half-lives in the experiment's results showed that several nuclear reactions were occurring, Fermi's group could not prove that element 93 was being created unless they could isolate it chemically. They and many other scientists attempted to accomplish this, including Otto Hahn and Lise Meitner, who were among the best radiochemists in the world at the time and supporters of Fermi's claim, but they all failed. Much later, it was determined that the main reason for this failure was that the predictions of element 93's chemical properties were based on a periodic table that lacked the actinide series. 
This arrangement placed protactinium below tantalum, uranium below tungsten, and further suggested that element 93, at that point referred to as eka-rhenium, should be similar to the group 7 elements, including manganese and rhenium. Thorium, protactinium, and uranium, with their dominant oxidation states of +4, +5, and +6 respectively, fooled scientists into thinking they belonged below hafnium, tantalum, and tungsten, rather than below the lanthanide series, which was at the time viewed as a fluke, and whose members all have dominant +3 states; neptunium, on the other hand, has a much rarer, more unstable +7 state, with +4 and +5 being the most stable. Upon finding that plutonium and the other transuranic elements also have dominant +3 and +4 states, along with the discovery of the f-block, the actinide series was firmly established. While the question of whether Fermi's experiment had produced element 93 was stalemated, two additional claims of the discovery of the element appeared, although unlike Fermi, both claimants reported observing it in nature. The first of these claims was made by Czech engineer Odolen Koblic in 1934 when he extracted a small amount of material from the wash water of heated pitchblende. He proposed the name bohemium for the element, but analysis showed the sample to be a mixture of tungsten and vanadium. The other claim came in 1938 from Romanian physicist Horia Hulubei and French chemist Yvette Cauchois, who reported discovering the new element via spectroscopy in minerals. They named their element sequanium, but the claim was discounted because the prevailing theory at the time was that if it existed at all, element 93 would not exist naturally. However, as neptunium does in fact occur in nature in trace amounts, as demonstrated when it was found in uranium ore in 1952, it is possible that Hulubei and Cauchois did in fact observe neptunium. 
Although by 1938 some scientists, including Niels Bohr, were still reluctant to accept that Fermi had actually produced a new element, he was nevertheless awarded the Nobel Prize in Physics in November 1938 "for his demonstrations of the existence of new radioactive elements produced by neutron irradiation, and for his related discovery of nuclear reactions brought about by slow neutrons". A month later, the almost totally unexpected discovery of nuclear fission by Hahn, Meitner, and Otto Frisch put an end to the possibility that Fermi had discovered element 93 because most of the unknown half-lives that had been observed by Fermi's team were rapidly identified as those of fission products. Perhaps the closest of all attempts to produce the missing element 93 was that conducted by the Japanese physicist Yoshio Nishina working with chemist Kenjiro Kimura in 1940, just before the outbreak of the Pacific War in 1941: they bombarded 238U with fast neutrons. However, while slow neutrons tend to induce neutron capture through a (n, γ) reaction, fast neutrons tend to induce a "knock-out" (n, 2n) reaction, where one neutron is added and two more are removed, resulting in the net loss of a neutron. Nishina and Kimura, having tested this technique on 232Th and successfully produced the known 231Th and its long-lived beta decay daughter 231Pa (both occurring in the natural decay chain of 235U), therefore correctly assigned the new 6.75-day half-life activity they observed to the new isotope 237U. They confirmed that this isotope was also a beta emitter and must hence decay to the then-unknown nuclide of element 93 with mass number 237. 
They attempted to isolate this nuclide by carrying it with its supposed lighter congener rhenium, but no beta or alpha decay was observed from the rhenium-containing fraction: Nishina and Kimura thus correctly speculated that the half-life of this nuclide, like that of 231Pa, was very long and hence its activity would be so weak as to be unmeasurable by their equipment, thus concluding the last and closest unsuccessful search for transuranic elements. Discovery As research on nuclear fission progressed in early 1939, Edwin McMillan at the Berkeley Radiation Laboratory of the University of California, Berkeley decided to run an experiment bombarding uranium using the powerful 60-inch (1.52 m) cyclotron that had recently been built at the university. The purpose was to separate the various fission products produced by the bombardment by exploiting the enormous force that the fragments gain from their mutual electrical repulsion after fissioning. Although he did not discover anything of note from this, McMillan did observe two new beta decay half-lives in the uranium trioxide target itself, which meant that whatever was producing the radioactivity had not violently repelled each other like normal fission products. He quickly realized that one of the half-lives closely matched the known 23-minute decay period of uranium-239, but the other half-life of 2.3 days was unknown. McMillan took the results of his experiment to chemist and fellow Berkeley professor Emilio Segrè to attempt to isolate the source of the radioactivity. Both scientists began their work using the prevailing theory that element 93 would have similar chemistry to rhenium, but Segrè rapidly determined that McMillan's sample was not at all similar to rhenium. Instead, when he reacted it with hydrogen fluoride (HF) with a strong oxidizing agent present, it behaved much like members of the rare earths. 
Since these elements comprise a large percentage of fission products, Segrè and McMillan decided that the half-life must have been simply another fission product, titling the paper "An Unsuccessful Search for Transuranium Elements". However, as more information about fission became available, the possibility that the fragments of nuclear fission could still have been present in the target became more remote. McMillan and several scientists, including Philip H. Abelson, attempted again to determine what was producing the unknown half-life. In early 1940, McMillan realized that his 1939 experiment with Segrè had failed to test the chemical reactions of the radioactive source with sufficient rigor. In a new experiment, McMillan tried subjecting the unknown substance to HF in the presence of a reducing agent, something he had not done before. This reaction resulted in the sample precipitating with the HF, an action that definitively ruled out the possibility that the unknown substance was a rare-earth metal. Shortly after this, Abelson, who had received his graduate degree from the university, visited Berkeley for a short vacation and McMillan asked the more able chemist to assist with the separation of the experiment's results. Abelson very quickly observed that whatever was producing the 2.3-day half-life did not have chemistry like any known element and was actually more similar to uranium than a rare-earth metal. This discovery finally allowed the source to be isolated and later, in 1945, led to the classification of the actinide series. 
As a final step, McMillan and Abelson prepared a much larger sample of bombarded uranium that had a prominent 23-minute half-life from 239U and demonstrated conclusively that the unknown 2.3-day half-life increased in strength in concert with a decrease in the 23-minute activity through the following reaction: 238U + n → 239U → (β−, 23 min) 239Np → (β−, 2.355 days) 239Pu (The times are half-lives.) This proved that the unknown radioactive source originated from the decay of uranium and, coupled with the previous observation that the source was different chemically from all known elements, proved beyond all doubt that a new element had been discovered. McMillan and Abelson published their results in a paper entitled Radioactive Element 93 in the Physical Review on May 27, 1940. They did not propose a name for the element in the paper, but they soon decided on the name neptunium since Neptune is the next planet beyond Uranus in our solar system. McMillan and Abelson's success compared to Nishina and Kimura's near miss can be attributed to the favorable half-life of 239Np for radiochemical analysis and quick decay of 239U, in contrast to the slower decay of 237U and extremely long half-life of 237Np. Subsequent developments It was also realized that the beta decay of 239Np must produce an isotope of element 94 (now called plutonium), but the quantities involved in McMillan and Abelson's original experiment were too small to isolate and identify plutonium along with neptunium. The discovery of plutonium had to wait until the end of 1940, when Glenn T. Seaborg and his team identified the isotope plutonium-238. In 1942, Hahn and Fritz Strassmann, and independently Kurt Starke, reported the confirmation of element 93 in Berlin. Hahn's group did not pursue element 94, likely because they were discouraged by McMillan and Abelson's lack of success in isolating it. 
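The growth-and-decay pattern McMillan and Abelson tracked, with the 2.3-day activity rising as the 23-minute activity died away, follows the standard Bateman solution for a two-member decay chain. A minimal Python sketch using the half-lives from the 239U → 239Np decay described above (the modelling itself is generic decay math, not the authors' procedure):

```python
import math

T_U239 = 23.0           # minutes, half-life of 239U (parent)
T_NP239 = 2.355 * 1440  # minutes, half-life of 239Np (daughter, 2.355 days)

l1 = math.log(2) / T_U239   # decay constant of 239U
l2 = math.log(2) / T_NP239  # decay constant of 239Np

def np239_activity(t_min, n0=1.0):
    """239Np activity (decays/min) at time t_min, starting from n0 atoms of pure 239U."""
    # Bateman solution for the daughter nuclide of a two-step chain
    n2 = n0 * l1 / (l1 - l2) * (math.exp(-l2 * t_min) - math.exp(-l1 * t_min))
    return l2 * n2

# 239Np activity grows over the first hours while 239U decays away...
print(np239_activity(10), np239_activity(120))
# ...then falls off with the 2.355-day half-life once the parent is exhausted
print(np239_activity(1440), np239_activity(10 * 1440))
```

Plotting this activity over a few days reproduces the qualitative signature they reported: the daughter activity strengthens exactly as the parent activity decays away.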
Since they had access to the stronger cyclotron at Paris at this point, Hahn's group would likely have been able to detect element 94 had they tried, albeit in tiny quantities (a few becquerels). Neptunium's unique radioactive characteristics allowed it to be traced as it moved through various compounds in chemical reactions; at first this was the only method available to prove that its chemistry was different from that of other elements. As the first isotope of neptunium to be discovered had such a short half-life, McMillan and Abelson were unable to prepare a sample that was large enough to perform chemical analysis of the |
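The growth of the 2.3-day activity out of the 23-minute activity follows the standard two-step decay law (the Bateman equation). A minimal numerical sketch, using only the half-lives quoted in the text (the function names and normalization are illustrative):

```python
import math

# Half-lives quoted in the text: 239U (23 min) beta-decays to 239Np (2.355 d).
T_HALF_U = 23.0 * 60            # seconds
T_HALF_NP = 2.355 * 24 * 3600   # seconds
LAM_U = math.log(2) / T_HALF_U  # decay constants
LAM_NP = math.log(2) / T_HALF_NP

def u239_activity(t, n0=1.0):
    """Activity of the 239U parent, starting from n0 atoms at t = 0."""
    return LAM_U * n0 * math.exp(-LAM_U * t)

def np239_activity(t, n0=1.0):
    """Bateman solution: activity of 239Np grown in from n0 atoms of 239U."""
    n_np = n0 * LAM_U / (LAM_NP - LAM_U) * (
        math.exp(-LAM_U * t) - math.exp(-LAM_NP * t))
    return LAM_NP * n_np

# The 239Np activity rises exactly as the 23-minute 239U activity dies away:
for hours in (0.5, 2.0, 6.0):
    t = hours * 3600
    print(f"t = {hours:>3} h  239U: {u239_activity(t):.3e}  239Np: {np239_activity(t):.3e}")
```

This is the quantitative content of McMillan and Abelson's observation: the daughter activity "increased in strength in concert with a decrease in the 23-minute activity".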
Energy differences were then attributed to "resolution and drift problems", although these had not been previously reported and should also have influenced other results. 1977 experiments showed that 252102 indeed had a 2.3-second half-life. However, 1973 work also showed that the 250Fm recoil could have also easily been produced from the isomeric transition of 250mFm (half-life 1.8 s), which could also have been formed in the reaction at the energy used. Given this, it is probable that no nobelium was actually produced in this experiment. In 1959, the team continued their studies and claimed that they were able to produce an isotope that decayed predominantly by emission of an 8.3 MeV alpha particle, with a half-life of 3 s and an associated 30% spontaneous fission branch. The activity was initially assigned to 254102 but later changed to 252102. However, they also noted that it was not certain that nobelium had been produced due to difficult conditions. The Berkeley team decided to adopt the proposed name of the Swedish team, "nobelium", for the element. The production reaction was: 244Cm + 12C → 256102* → 252102 + 4n. Meanwhile, in Dubna, experiments were carried out in 1958 and 1960 aiming to synthesize element 102 as well. The first 1958 experiment bombarded plutonium-239 and -241 with oxygen-16 ions. Some alpha decays with energies just over 8.5 MeV were observed, and they were assigned to 251,252,253102, although the team wrote that formation of isotopes from lead or bismuth impurities (which would not produce nobelium) could not be ruled out. While later 1958 experiments noted that new isotopes could be produced from mercury, thallium, lead, or bismuth impurities, the scientists still stood by their conclusion that element 102 could be produced from this reaction, mentioning a half-life of under 30 seconds and a decay energy of (8.8 ± 0.5) MeV. Later 1960 experiments proved that these were background effects.
1967 experiments also lowered the decay energy to (8.6 ± 0.4) MeV, but both values are too high to possibly match those of 253No or 254No. The Dubna team later stated in 1970 and again in 1987 that these results were not conclusive. In 1961, Berkeley scientists claimed the discovery of element 103 in the reaction of californium with boron and carbon ions. They claimed the production of the isotope 257103, and also claimed to have synthesized an alpha decaying isotope of element 102 that had a half-life of 15 s and alpha decay energy 8.2 MeV. They assigned this to 255102 without giving a reason for the assignment. The values do not agree with those now known for 255No, although they do agree with those now known for 257No, and while this isotope probably played a part in this experiment, its discovery was inconclusive. Work on element 102 also continued in Dubna, and in 1964, experiments were carried out there to detect alpha-decay daughters of element 102 isotopes by synthesizing element 102 from the reaction of a uranium-238 target with neon ions. The products were carried along a silver catcher foil and purified chemically, and the isotopes 250Fm and 252Fm were detected. The yield of 252Fm was interpreted as evidence that its parent 256102 was also synthesized: as it was noted that 252Fm could also be produced directly in this reaction by the simultaneous emission of an alpha particle with the excess neutrons, steps were taken to ensure that 252Fm could not go directly to the catcher foil. The half-life detected for 256102 was 8 s, which is much higher than the more modern 1967 value of (3.2 ± 0.2) s. 
Further experiments were conducted in 1966 for 254102, using the reactions 243Am(15N,4n)254102 and 238U(22Ne,6n)254102, finding a half-life of (50 ± 10) s: at that time the discrepancy between this value and the earlier Berkeley value was not understood, although later work proved that the formation of the isomer 250mFm was less likely in the Dubna experiments than in the Berkeley ones. In hindsight, the Dubna results on 254102 were probably correct and can now be considered a conclusive detection of element 102. One more very convincing experiment from Dubna was published in 1966, again using the same two reactions, which concluded that 254102 indeed had a half-life much longer than the 3 seconds claimed by Berkeley. Later work in 1967 at Berkeley and 1971 at the Oak Ridge National Laboratory fully confirmed the discovery of element 102 and clarified earlier observations. In December 1966, the Berkeley group repeated the Dubna experiments and fully confirmed them, and used this data to finally assign correctly the isotopes they had previously synthesized but could not yet identify at the time, and thus claimed to have discovered nobelium in 1958 to 1961. One of the reactions used was: 238U + 22Ne → 260102* → 254102 + 6n. In 1969, the Dubna team carried out chemical experiments on element 102 and concluded that it behaved as the heavier homologue of ytterbium. The Russian scientists proposed the name joliotium (Jo) for the new element after Irène Joliot-Curie, who had recently died, creating an element naming controversy that would not be resolved for several decades, with each group using its own proposed names. In 1992, the IUPAC-IUPAP Transfermium Working Group (TWG) reassessed the claims of discovery and concluded that only the Dubna work from 1966 correctly detected and assigned decays to nuclei with atomic number 102 at the time. The Dubna team are therefore officially recognised as the discoverers of nobelium, although it is possible that it was detected at Berkeley in 1959.
Berkeley criticized this decision the following year, calling the reopening of the cases of elements 101 to 103 a "futile waste of time", while Dubna agreed with IUPAC's decision. In 1994, as part of an attempted resolution to the element naming controversy, IUPAC ratified names for elements 101–109. For element 102, it ratified the name nobelium (No) on the basis that it had become entrenched in the literature over the course of 30 years and that Alfred Nobel should be commemorated in this fashion. Because of outcry over the 1994 names, which mostly did not respect the choices of the discoverers, a comment period ensued, and in 1995 IUPAC named element 102 flerovium (Fl) as part of a new proposal, after either Georgy Flyorov or his eponymous Flerov Laboratory of Nuclear Reactions. This proposal was also not accepted, and in 1997 the name nobelium was restored. Today the name flerovium, with the same symbol, refers to element 114. Characteristics Physical In the periodic table, nobelium is located to the right of the actinide mendelevium, to the left of the actinide lawrencium, and below the lanthanide ytterbium. Nobelium metal has not yet been prepared in bulk quantities, and bulk preparation is currently impossible. Nevertheless, a number of predictions have been made and some preliminary experimental results obtained regarding its properties. The lanthanides and actinides, in the metallic state, can exist as either divalent (such as europium and ytterbium) or trivalent (most other lanthanides) metals. The former have fns2 configurations, whereas the latter have fn−1d1s2 configurations. In 1975, Johansson and Rosengren examined the measured and predicted values for the cohesive energies (enthalpies of crystallization) of the metallic lanthanides and actinides, both as divalent and trivalent metals.
The conclusion was that the increased binding energy of the [Rn]5f136d17s2 configuration over the [Rn]5f147s2 configuration for nobelium was not enough to compensate for the energy needed to promote one 5f electron to 6d, as is true also for the very late actinides: thus einsteinium, fermium, mendelevium, and nobelium were expected to be divalent metals, although for nobelium this prediction has not yet been confirmed. The increasing predominance of the divalent state well before the actinide series concludes is attributed to the relativistic stabilization of the 5f electrons, which increases with increasing atomic number: an effect of this is that nobelium is predominantly divalent instead of trivalent, unlike all the other lanthanides and actinides. In 1986, nobelium metal was estimated to have an enthalpy of sublimation of 126 kJ/mol, a value close to the values for einsteinium, fermium, and mendelevium and supporting the theory that nobelium would form a divalent metal. Like the other divalent late actinides (except the once again trivalent lawrencium), metallic nobelium should assume a face-centered cubic crystal structure. Divalent nobelium metal should have a metallic radius of around 197 pm. Nobelium's melting point has been predicted to be 827 °C, the same value as that estimated for the neighboring element mendelevium. Its density is predicted to be around 9.9 ± 0.4 g/cm3. Chemical The chemistry of nobelium is incompletely characterized and is known only in aqueous solution, in which it can take on the +3 or +2 oxidation states, the latter being more stable. It was largely expected before the discovery of nobelium that in solution, it would behave like the other actinides, with the trivalent state being predominant; however, Seaborg predicted in 1949 that the +2 state would also be relatively stable for nobelium, as the No2+ ion would have the ground-state electron configuration [Rn]5f14, including the stable filled 5f14 shell.
It took nineteen years before this prediction was confirmed. In 1967, experiments were conducted to compare nobelium's chemical behavior to that | scale. Chemistry experiments have confirmed that nobelium behaves as a heavier homolog to ytterbium in the periodic table. The chemical properties of nobelium are incompletely known: they are mostly known only in aqueous solution. Before nobelium's discovery, it was predicted that it would show a stable +2 oxidation state as well as the +3 state characteristic of the other actinides; these predictions were later confirmed, as the +2 state is much more stable than the +3 state in aqueous solution and it is difficult to keep nobelium in the +3 state. In the 1950s and 1960s, many claims of the discovery of nobelium were made from laboratories in Sweden, the Soviet Union, and the United States. Although the Swedish scientists soon retracted their claims, the priority of the discovery and therefore the naming of the element was disputed between Soviet and American scientists, and it was not until 1997 that the International Union of Pure and Applied Chemistry (IUPAC) credited the Soviet team with the discovery, but retained nobelium, the Swedish proposal, as the name of the element due to its long-standing use in the literature. Introduction Discovery The discovery of element 102 was a complicated process and was claimed by groups from Sweden, the United States, and the Soviet Union. The first complete and incontrovertible report of its detection came only in 1966, from the Joint Institute for Nuclear Research at Dubna (then in the Soviet Union). The discovery of element 102 was first announced by physicists at the Nobel Institute in Sweden in 1957. The team reported that they had bombarded a curium target with carbon-13 ions for twenty-five hours in half-hour intervals. Between bombardments, ion-exchange chemistry was performed on the target.
Twelve out of the fifty bombardments contained samples emitting (8.5 ± 0.1) MeV alpha particles, which were in drops that eluted earlier than fermium (atomic number Z = 100) and californium (Z = 98). The half-life reported was 10 minutes and was assigned to either 251102 or 253102, although the possibility that the alpha particles observed were from a presumably short-lived mendelevium (Z = 101) isotope created from the electron capture of element 102 was not excluded. The team proposed the name nobelium (No) for the new element, which was immediately approved by IUPAC, a decision the Dubna group characterized in 1968 as hasty. In 1958, scientists at the Lawrence Berkeley National Laboratory repeated the experiment. The Berkeley team, consisting of Albert Ghiorso, Glenn T. Seaborg, John R. Walton and Torbjørn Sikkeland, used the new heavy-ion linear accelerator (HILAC) to bombard a curium target (95% 244Cm and 5% 246Cm) with 13C and 12C ions. They were unable to confirm the 8.5 MeV activity claimed by the Swedes but were instead able to detect decays from fermium-250, supposedly the daughter of 254102 (produced from the curium-246), which had an apparent half-life of ~3 s. In 1959, the Swedish team attempted to explain the Berkeley team's inability to detect element 102 in 1958, maintaining that they did discover it. However, later work has shown that no nobelium isotopes lighter than 259No (no heavier isotopes could have been produced in the Swedish experiments) with a half-life over 3 minutes exist, and that the Swedish team's results are most likely from thorium-225, which has a half-life of 8 minutes and quickly undergoes triple alpha decay to polonium-213, which has a decay energy of 8.53612 MeV. This hypothesis is lent weight by the fact that thorium-225 can easily be produced in the reaction used and would not be separated out by the chemical methods used.
Later work on nobelium also showed that the divalent state is more stable than the trivalent one and hence that the samples emitting the alpha particles could not have contained nobelium, as the divalent nobelium would not have eluted with the other trivalent actinides. Thus, the Swedish team later retracted their claim and attributed the activity to background effects. Later 1963 Dubna work confirmed that 254102 could be produced in this reaction, but that its half-life was actually much longer than the 3 seconds reported by Berkeley. In 1967, the Berkeley team attempted to defend their work, stating that the isotope found was indeed 250Fm but that the isotope the half-life measurements actually related to was californium-244, granddaughter of 252102, produced from the more abundant curium-244.
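The isotope assignments in reactions such as 243Am(15N,4n)254102 and 238U(22Ne,6n)254102 follow from simple nucleon bookkeeping: mass number and charge must balance across the reaction. A minimal sketch (the function name is illustrative):

```python
# Check that a heavy-ion fusion-evaporation reaction yields the claimed isotope:
# mass number A and atomic number Z must balance across the reaction.
def evaporation_residue(target, beam, neutrons_out):
    """(A, Z) of the residue left after the compound nucleus emits neutrons."""
    a_target, z_target = target
    a_beam, z_beam = beam
    return (a_target + a_beam - neutrons_out, z_target + z_beam)

# 243Am(15N,4n): americium-243 (Z = 95) plus nitrogen-15 (Z = 7), 4 neutrons out
print(evaporation_residue((243, 95), (15, 7), 4))   # (254, 102)
# 238U(22Ne,6n): uranium-238 (Z = 92) plus neon-22 (Z = 10), 6 neutrons out
print(evaporation_residue((238, 92), (22, 10), 6))  # (254, 102)
```

Both reactions land on A = 254 and Z = 102, i.e. the isotope 254102 discussed in the text.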
of years. The coastal waters of the remote Lofoten islands are one of the richest fishing areas in Europe, as most of the Atlantic cod swims to the coastal waters of Lofoten in the winter to spawn. Thus, in the 19th century, dried cod was one of Norway's main exports and by far the most important industry in northern Norway. Strong sea currents, maelstroms, and especially frequent storms made fishing a dangerous occupation: several hundred men died on the "Fatal Monday" in March 1821, 300 of them from a single parish, and about a hundred boats with their crews were lost within a short time in April 1875. Over the last century, the Norwegian Sea has suffered from overfishing. In 2018, 41% of stocks were excessively harvested. Only two of the sixteen Total Allowed Catches (TACs) agreed upon by the European Union (EU) and Norway follow scientific advice; nine are set at least 25% above scientific advice, and the other five exceed it once the landing obligation is excluded. Under the Common Fisheries Policy (CFP), the EU committed to phase out overfishing by 2015, or 2020 at the absolute latest. As of 2019, the EU was reported not to be on track to achieve that goal. Whaling was also important for the Norwegian Sea. In the early 1600s, the Englishman Stephen Bennet started hunting walrus at Bear Island. In May 1607 the Muscovy Company, while looking for the Northwest Passage and exploring the sea, discovered the large populations of walrus and whales in the Norwegian Sea and started hunting them in 1610 near Spitsbergen. Later in the 17th century, Dutch ships started hunting bowhead whales near Jan Mayen; the bowhead population between Svalbard and Jan Mayen was then about 25,000 individuals. The British and Dutch were then joined by Germans, Danes, and Norwegians.
Between 1615 and 1820, the waters between Jan Mayen, Svalbard, Bear Island, and Greenland, between the Norwegian, Greenland, and Barents Seas, were the most productive whaling area in the world. However, extensive hunting had wiped out the whales in that region by the early 20th century. Sea monsters and maelstroms For many centuries, the Norwegian Sea was regarded as the edge of the known world. The disappearance of ships there, due to natural disasters, induced legends of monsters that stopped and sank ships (the kraken). As late as 1845, the Encyclopædia metropolitana contained a multi-page review by Erik Pontoppidan (1698–1764) on ship-sinking sea monsters half a mile in size. Many legends might be based on the work Historia de gentibus septentrionalibus of 1555 by Olaus Magnus, which described the kraken and maelstroms of the Norwegian Sea. The kraken also appears in Alfred Tennyson's poem of the same name, in Herman Melville's Moby Dick, and in Twenty Thousand Leagues Under the Sea by Jules Verne. Between the Lofoten islands of Moskenesøya and Værøy, at the tiny Mosken island, lies the Moskenstraumen – a system of tidal eddies and a whirlpool called a maelstrom. Its current speed, estimates of which vary strongly between sources, makes it one of the strongest maelstroms in the world. It was described in the 13th century in the Old Norse Poetic Edda and remained an attractive subject for painters and writers, including Edgar Allan Poe, Walter Moers and Jules Verne. The word was introduced into the English language by Poe in his story "A Descent into the Maelström" (1841) describing the Moskenstraumen. The Moskenstraumen is created by a combination of several factors, including the tides, the position of the Lofoten, and the underwater topography; unlike most other whirlpools, it is located in the open sea rather than in a channel or bay.
With a diameter of 40–50 metres, it can be dangerous even in modern times to small fishing vessels, which might be attracted by the abundant cod feeding on the microorganisms sucked in by the whirlpool. Exploration The fish-rich coastal waters of northern Norway have long been known and attracted skilled sailors from Iceland and Greenland. Thus most settlements in Iceland and Greenland were on the west coasts of the islands, which were also warmer due to the Atlantic currents. The first reasonably reliable map of northern Europe, the Carta marina of 1539, represents the Norwegian Sea as coastal waters and shows nothing north of the North Cape. The offshore regions of the Norwegian Sea appeared on maps in the 17th century as an important part of the then-sought Northern Sea Route and as a rich whaling ground. Jan Mayen island was discovered in 1607 and became an important base for Dutch whalers. The Dutchman Willem Barents discovered Bear Island and Svalbard, which were then used by Russian whalers called Pomors. The islands on the edge of the Norwegian Sea were rapidly divided among nations. At the peak of whaling, some 300 ships with 12,000 crew members visited Svalbard each year. The first depth measurements of the Norwegian Sea were performed in 1773 by Constantine Phipps aboard HMS Racehorse, as part of his North Pole expedition. Systematic oceanographic research in the Norwegian Sea started in the late 19th century, when declines in the yields of cod and herring off the Lofoten prompted the Norwegian government to investigate the matter. The zoologist Georg Ossian Sars and the meteorologist Henrik Mohn persuaded the government in 1874 to send out a scientific expedition, and between 1876 and 1878 they explored much of the sea aboard Vøringen. The data obtained allowed Mohn to establish the first dynamic model of ocean currents, which incorporated winds, pressure differences, sea water temperature, and salinity, and agreed well with later measurements.
In 2019, deposits of iron, copper, zinc and cobalt were found on the Mohn Ridge, likely from hydrothermal vents. Navigation Until the 20th century, the coasts of the Norwegian Sea were sparsely populated, and shipping in the sea was therefore mostly focused on fishing, whaling, and occasional coastal transportation. In the late 19th century, the Norwegian Coastal Express line was established, connecting the more densely populated south with the north of Norway with at least one trip a day. The importance of shipping in the Norwegian Sea also increased with the expansion of the Russian and Soviet navies in the Barents Sea and the development of international routes to the Atlantic through the Baltic Sea, Kattegat, Skagerrak, and North Sea. The Norwegian Sea is ice-free and provides a direct route from the Atlantic to the Russian ports in the Arctic (Murmansk, Arkhangelsk, and Kandalaksha), which are directly linked to central Russia. This route was extensively used for supplies during World War II – of 811 US ships, 720 reached Russian ports, bringing some 4 million tonnes of cargo that included about 5,000 tanks and 7,000 aircraft. The Allies lost 18 convoys and 89 merchant ships on this route. The major operations of the German Navy against the convoys included the attack on convoy PQ 17 in July 1942, the Battle of the Barents Sea in December 1942, and the Battle of the North Cape in December 1943, and were carried out around the border between the Norwegian Sea and Barents Sea, near the North Cape. Navigation across the Norwegian Sea declined after World War II and intensified only in the 1960s–70s with the expansion of the Soviet Northern Fleet, which was reflected in major joint naval exercises of the Soviet Northern and Baltic fleets in the Norwegian Sea. The sea was the gateway for the Soviet Navy to the Atlantic Ocean and thus to the United States, and the major Soviet port of Murmansk lay just beyond the boundary between the Norwegian and Barents Seas.
The countermeasures by the NATO countries resulted in a significant naval presence in the Norwegian Sea and intense cat-and-mouse games between Soviet and NATO aircraft, ships, and especially submarines. A | twice as high in winter as in summer. At the Faroe-Shetland Channel this water has a temperature of about 9.5 °C; it cools to about 5 °C at Svalbard and releases this energy (about 250 terawatts) to the environment. The current flowing from the North Sea originates in the Baltic Sea and thus collects most of the drainage from northern Europe; this contribution is, however, relatively small. The temperature and salinity of this current show strong seasonal and annual fluctuations. Long-term measurements within the top 50 metres near the coast show a maximum temperature of 11.2 °C at the 63° N parallel in September and a minimum of 3.9 °C at the North Cape in March. The salinity varies between 34.3 and 34.6‰ and is lowest in spring owing to the inflow of melted snow from rivers. The largest rivers discharging into the sea are the Namsen, Ranelva and Vefsna. They are all relatively short but have a high discharge rate owing to their steep, mountainous courses. A portion of the warm surface water flows directly, within the West Spitsbergen Current, from the Atlantic Ocean, off the Greenland Sea, to the Arctic Ocean. This current has a transport of 3–5 Sv and has a large impact on the climate. Other surface water (~1 Sv) flows along the Norwegian coast in the direction of the Barents Sea. This water may cool enough in the Norwegian Sea to sink into the deeper layers, where it displaces water that flows back into the North Atlantic. Arctic water from the East Iceland Current is mostly found in the southwestern part of the sea, near Greenland. Its properties also show significant annual fluctuations, with the long-term average temperature being below 3 °C and salinity between 34.7 and 34.9‰.
The fraction of this water on the sea surface depends on the strength of the current, which in turn depends on the pressure difference between the Icelandic Low and the Azores High: the larger the difference, the stronger the current. Deep-sea currents The Norwegian Sea is connected with the Greenland Sea and the Arctic Ocean by the 2,600-metre-deep Fram Strait. The Norwegian Sea Deep Water (NSDW) occurs at depths exceeding 2,000 metres; this homogeneous layer, with a salinity of 34.91‰, experiences little exchange with the adjacent seas. Its temperature is below 0 °C and drops to −1 °C at the ocean floor. Compared with the deep waters of the surrounding seas, NSDW has more nutrients but less oxygen and is relatively old. The weak deep-water exchange with the Atlantic Ocean is due to the small depth of the relatively flat Greenland-Scotland Ridge between Scotland and Greenland, an offshoot of the Mid-Atlantic Ridge. Only four areas of the Greenland-Scotland Ridge are deeper than 500 metres: the Faroe Bank Channel (about 850 metres), some parts of the Iceland-Faroe Ridge (about 600 metres), the Wyville-Thomson Ridge (620 metres), and areas between Greenland and the Denmark Strait (850 metres) – all much shallower than the Norwegian Sea. Cold deep water flows into the Atlantic through various channels: about 1.9 Sv through the Faroe Bank Channel, 1.1 Sv through the Iceland-Faroe channel, and 0.1 Sv via the Wyville-Thomson Ridge. The turbulence that occurs when the deep water cascades over the Greenland-Scotland Ridge into the deep Atlantic basin mixes the adjacent water layers and forms the North Atlantic Deep Water, one of the two major deep-sea currents providing the deep ocean with oxygen. Climate The thermohaline circulation affects the climate in the Norwegian Sea, and the regional climate can deviate significantly from the average. There is also a temperature difference of about 10 °C between the sea and the coastline.
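The roughly 250 TW heat release attributed to the cooling Atlantic inflow can be sanity-checked with the bulk heat-flux formula Q = ρ·c_p·Φ·ΔT. A rough sketch, assuming typical seawater properties and a total Atlantic inflow of about 13 Sv (an assumed figure for illustration; the text quotes 3–5 Sv for the West Spitsbergen branch alone):

```python
# Order-of-magnitude check of the quoted ~250 TW heat release.
RHO = 1025.0      # seawater density, kg/m^3 (assumed typical value)
CP = 3990.0       # seawater specific heat capacity, J/(kg K) (assumed typical value)
SVERDRUP = 1.0e6  # 1 Sv = 10^6 m^3/s

def heat_release_watts(transport_sv, t_in_c, t_out_c):
    """Heat given up by a current cooling from t_in_c to t_out_c, in watts."""
    return RHO * CP * transport_sv * SVERDRUP * (t_in_c - t_out_c)

# Assumed ~13 Sv of Atlantic water cooling from 9.5 degC to 5 degC:
q = heat_release_watts(13.0, 9.5, 5.0)
print(f"{q / 1e12:.0f} TW")  # on the order of the quoted 250 TW
```

The exercise shows the quoted figure is consistent with a total inflow roughly twice the named branch transports, cooling by about 4.5 °C.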
Temperatures rose between 1920 and 1960, and the frequency of storms decreased in this period. Storminess was relatively high between 1880 and 1910, decreased significantly in 1910–1960, and then recovered to the original level. In contrast to the Greenland Sea and the Arctic seas, the Norwegian Sea is ice-free year round owing to its warm currents. The convection between the relatively warm water and cold air in the winter plays an important role in the Arctic climate. The 10-degree July isotherm (air temperature line) runs through the northern boundary of the Norwegian Sea and is often taken as the southern boundary of the Arctic. In winter, the Norwegian Sea generally has the lowest air pressure in the entire Arctic and is where most Icelandic Low depressions form. The water temperature in most parts of the sea is 2–7 °C in February and 8–12 °C in August. Flora and fauna The Norwegian Sea is a transition zone between boreal and Arctic conditions, and thus contains flora and fauna characteristic of both climatic regions. The southern limit of many Arctic species runs through the North Cape, Iceland, and the center of the Norwegian Sea, while the northern limit of boreal species lies near the borders of the Greenland Sea with the Norwegian Sea and Barents Sea; that is, these areas overlap. Some species, like the scallop Chlamys islandica and the capelin, tend to occupy this area between the Atlantic and Arctic oceans. Plankton and sea bottom organisms Most of the aquatic life in the Norwegian Sea is concentrated in the upper layers. Estimates for the entire North Atlantic are that only 2% of biomass is produced at depths below 1,000 metres and only 1.2% occurs near the sea floor. The blooming of the phytoplankton, dominated by chlorophyll, peaks around 20 May. The major phytoplankton forms are diatoms, in particular the genera Thalassiosira and Chaetoceros. After the spring bloom, haptophytes of the species Phaeocystis pouchetii become dominant.
Zooplankton is mostly represented by the copepods Calanus finmarchicus and Calanus hyperboreus; the former occurs about four times more often than the latter and is mostly found in the Atlantic streams, whereas C. hyperboreus dominates the Arctic waters; they are the main diet of most marine predators. The most important krill species are Meganyctiphanes norvegica, Thysanoessa inermis, and Thysanoessa longicaudata. In contrast to the Greenland Sea, there is a significant presence of calcareous plankton (Coccolithophore and Globigerinida) in the Norwegian Sea. Plankton production fluctuates strongly between years. For example, the C. finmarchicus yield was 28 g/m2 (dry weight) in 1995 and only 8 g/m2 in 1997; this correspondingly affected the populations of all its predators. Shrimp of the species Pandalus borealis play an important role in the diet of fish, particularly cod and blue whiting, and mostly occur at depths between 200 and 300 metres. A special feature of the Norwegian Sea is its extensive coral reefs of Lophelia pertusa, which provide shelter to various fish species. Although these corals are widespread in many peripheral areas of the North Atlantic, they never reach such extent and concentration as on the Norwegian continental slopes. However, they are at risk from increasing trawling, which mechanically destroys the coral reefs. Fish The Norwegian coastal waters are the most important spawning ground of the herring populations of the North Atlantic, with hatching occurring in March. The eggs float to the surface and are washed off the coast by the northward current. Whereas a small herring population remains in the fjords and along the northern Norwegian coast, the majority spends the summer in the Barents Sea, where it feeds on the rich plankton. Upon reaching maturity, the herring return to the Norwegian Sea. The herring stock varies greatly between years.
It increased in the 1920s owing to the milder climate and then collapsed over the following decades until 1970; the decrease was, however, at least partly caused by overfishing. The biomass of young hatched herring declined from 11 million tonnes in 1956 to almost zero in 1970; this affected the ecosystem not only of the Norwegian Sea but also of the Barents Sea. Enforcement of environmental and fishing regulations has resulted in a partial recovery of the herring populations since 1987. This recovery was accompanied by a decline of the capelin and cod stocks. While the capelin benefited from the reduced fishing, the temperature rise in the 1980s and competition for food with the herring resulted in a near disappearance of young capelin from the Norwegian Sea. Meanwhile, the older capelin were quickly fished out. This also reduced the population of cod – a major predator of capelin – as the herring was still too small in numbers to replace the capelin in the cod's diet. Blue whiting (Micromesistius poutassou) has benefited from the decline of the herring and capelin stocks, as it assumed the role of major predator of plankton. The blue whiting spawns near the British Isles. The sea currents carry their eggs to the Norwegian Sea, and the adults also swim there to benefit from the food supply. The young spend the summer, and the winter until February, in Norwegian coastal waters and then return to the warmer waters west of Scotland. The Norwegian Arctic cod mostly occurs in the Barents Sea and around the Svalbard archipelago. In the rest of the Norwegian Sea it is found only during the reproduction season, at the Lofoten Islands, whereas Pollachius virens and haddock spawn in the coastal waters. Mackerel is an important commercial fish. The coral reefs are populated by various species of the genus Sebastes.
Mammals and birds Significant numbers of minke, humpback, sei, and orca whales are present in the Norwegian Sea, and white-beaked dolphins occur in the coastal waters. Orcas and some other whales visit the sea in the summer months to feed; their population is closely tied to the herring stocks, and they follow the herring schools within the sea. With a total population of about 110,000, minke whales are by far the most common whales in the sea. They are hunted by Norway and Iceland, with a Norwegian quota of about 1,000 per year. In contrast to the past, nowadays primarily their meat is consumed rather than fat and oil. The bowhead whale used to be a major plankton predator, but it almost disappeared from the Norwegian Sea after intense whaling in the 19th century and was for a time virtually absent from the entire North Atlantic. Similarly, the blue whale used to form large groups between Jan Mayen and Spitsbergen but is hardly present nowadays. Observations of northern bottlenose whales in the Norwegian Sea are rare. Other large animals of the sea are hooded and harp seals and squid. Important waterfowl species of the Norwegian Sea are the puffin, kittiwake and guillemot. Puffins and guillemots also suffered from the collapse of the herring population, especially the puffins on the Lofoten Islands. The latter hardly had an alternative to herring, and their population was approximately halved between 1969 and 1987. Human activities Norway, Iceland, and Denmark/Faroe Islands share the territorial waters of the Norwegian Sea, with the largest part belonging to Norway. Norway has claimed a twelve-nautical-mile territorial sea since 2004 and an exclusive economic zone of 200 nautical miles since 1976. Consequently, owing to the Norwegian islands of Svalbard and Jan Mayen, the southeastern, northeastern and northwestern edges of the sea fall within Norwegian waters. The southwestern border is shared between Iceland and Denmark/Faroe Islands.
According to the Føroyingasøga, Norse settlers arrived on the islands around the 8th century. King Harald Fairhair is credited with being the driving force behind the colonisation of these islands, as well as of others in the Norwegian Sea. The greatest damage to the Norwegian Sea has been caused by extensive fishing, whaling, and pollution. Contamination comes mostly from oil and toxic substances, but also from the great number of ships sunk during the two world wars. The environmental protection of the Norwegian Sea is mainly regulated by the OSPAR Convention. Fishing and whaling Fishing has been practised near the Lofoten archipelago for hundreds of years. The coastal waters of the remote Lofoten islands are among the richest fishing areas in Europe, as most of the Atlantic cod swims to the coastal waters of Lofoten in the winter to spawn. In the 19th century, dried cod was thus one of Norway's main exports and by far the most important industry in northern Norway. Strong sea currents, maelstroms, and especially frequent storms made fishing a dangerous occupation: several hundred men died on the "Fatal Monday" in March 1821, 300 of them from a single parish, and about a hundred boats with their crews were lost within a short time in April 1875. Over the last century, the Norwegian Sea has suffered from overfishing. In 2018, 41% of stocks were excessively harvested. Only two of the sixteen Total Allowable Catches (TACs) agreed upon by the European Union (EU) and Norway followed scientific advice; nine were set at least 25% above it, and the remaining five exceeded it once the landing obligation is excluded. Under the Common Fisheries Policy (CFP), the EU committed to phasing out overfishing by 2015, or 2020 at the absolute latest. As of 2019, the EU was reported not to be on track to achieve that goal. Whaling was also important in the Norwegian Sea. In the early 1600s, the Englishman Stephen Bennet started hunting walrus at Bear Island.
In May 1607, the Muscovy Company, while looking for the Northeast Passage and exploring the sea, discovered the large populations of walrus and whales in the Norwegian Sea and started hunting them in 1610 near Spitsbergen. Later in the 17th century, Dutch ships started hunting bowhead whales near Jan Mayen; the bowhead population between Svalbard and Jan Mayen was then about 25,000 individuals. The British and Dutch were later joined by Germans, Danes, and Norwegians. Between 1615 and 1820, the waters between Jan Mayen, Svalbard, Bear Island, and Greenland, spanning the Norwegian, Greenland, and Barents Seas, were the most productive whaling area in the world. However, extensive hunting had wiped out the whales in that region by the early 20th century. Sea monsters and maelstroms For many centuries, the Norwegian Sea was regarded as the edge of the known world. The disappearance of ships there, due to natural disasters, induced legends of monsters that
large part by Maria Goeppert Mayer and J. Hans D. Jensen. Nuclei with certain "magic" numbers of neutrons and protons are particularly stable, because their shells are filled. Other more complicated models for the nucleus have also been proposed, such as the interacting boson model, in which pairs of neutrons and protons interact as bosons. Ab initio methods try to solve the nuclear many-body problem from the ground up, starting from the nucleons and their interactions. Much of current research in nuclear physics relates to the study of nuclei under extreme conditions such as high spin and excitation energy. Nuclei may also have extreme shapes (similar to that of Rugby balls or even pears) or extreme neutron-to-proton ratios. Experimenters can create such nuclei using artificially induced fusion or nucleon transfer reactions, employing ion beams from an accelerator. Beams with even higher energies can be used to create nuclei at very high temperatures, and there are signs that these experiments have produced a phase transition from normal nuclear matter to a new state, the quark–gluon plasma, in which the quarks mingle with one another, rather than being segregated in triplets as they are in neutrons and protons. Nuclear decay Eighty elements have at least one stable isotope which is never observed to decay, amounting to a total of about 252 stable nuclides. However, thousands of isotopes have been characterized as unstable. These "radioisotopes" decay over time scales ranging from fractions of a second to trillions of years. Plotted on a chart as a function of atomic and neutron numbers, the binding energy of the nuclides forms what is known as the valley of stability. Stable nuclides lie along the bottom of this energy valley, while increasingly unstable nuclides lie up the valley walls, that is, have weaker binding energy. 
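The decay timescales mentioned above follow the exponential decay law. As a minimal illustrative sketch (the carbon-14 half-life used below is a well-known example value, not drawn from this article):

```python
import math

def remaining_fraction(t, half_life):
    """Fraction of a radioisotope surviving after time t.

    N(t)/N0 = exp(-lambda * t), with lambda = ln(2) / half_life;
    t and half_life must be in the same units.
    """
    decay_constant = math.log(2) / half_life
    return math.exp(-decay_constant * t)

# Carbon-14, half-life about 5730 years:
print(remaining_fraction(5730, 5730))      # one half-life  -> about 0.5
print(remaining_fraction(3 * 5730, 5730))  # three half-lives -> about 0.125
```

The same law covers the full range of timescales in the text: only the value of the half-life changes, from fractions of a second to trillions of years.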
The most stable nuclei fall within certain ranges or balances of composition of neutrons and protons: too few or too many neutrons (in relation to the number of protons) will cause it to decay. For example, in beta decay, a nitrogen-16 atom (7 protons, 9 neutrons) is converted to an oxygen-16 atom (8 protons, 8 neutrons) within a few seconds of being created. In this decay a neutron in the nitrogen nucleus is converted by the weak interaction into a proton, an electron and an antineutrino. The element is transmuted to another element, with a different number of protons. In alpha decay, which typically occurs in the heaviest nuclei, the radioactive element decays by emitting a helium nucleus (2 protons and 2 neutrons), giving another element, plus helium-4. In many cases this process continues through several steps of this kind, including other types of decays (usually beta decay) until a stable element is formed. In gamma decay, a nucleus decays from an excited state into a lower energy state, by emitting a gamma ray. The element is not changed to another element in the process (no nuclear transmutation is involved). Other more exotic decays are possible (see the first main article). For example, in internal conversion decay, the energy from an excited nucleus may eject one of the inner orbital electrons from the atom, in a process which produces high speed electrons but is not beta decay and (unlike beta decay) does not transmute one element to another. Nuclear fusion In nuclear fusion, two low-mass nuclei come into very close contact with each other so that the strong force fuses them. It requires a large amount of energy for the strong or nuclear forces to overcome the electrical repulsion between the nuclei in order to fuse them; therefore nuclear fusion can only take place at very high temperatures or high pressures. When nuclei fuse, a very large amount of energy is released and the combined nucleus assumes a lower energy level. 
The binding energy per nucleon increases with mass number up to nickel-62. Stars like the Sun are powered by the fusion of four protons into a helium nucleus, two positrons, and two neutrinos. The uncontrolled fusion of hydrogen into helium is known as thermonuclear runaway. A frontier in current research at various institutions, for example the Joint European Torus (JET) and ITER, is the development of an economically viable method of using energy from a controlled fusion reaction. Nuclear fusion is the origin of the energy (including in the form of light and other electromagnetic radiation) produced by the core of all stars including our own Sun. Nuclear fission Nuclear fission is the reverse process to fusion. For nuclei heavier than nickel-62 the binding energy per nucleon decreases with the mass number. It is therefore possible for energy to be released if a heavy nucleus breaks apart into two lighter ones. The process of alpha decay is in essence a special type of spontaneous nuclear fission. It is a highly asymmetrical fission because the four particles which make up the alpha particle are especially tightly bound to each other, making production of this nucleus in fission particularly likely. From several of the heaviest nuclei whose fission produces free neutrons, and which also easily absorb neutrons to initiate fission, a self-igniting type of neutron-initiated fission can be obtained, in a chain reaction. Chain reactions were known in chemistry before physics, and in fact many familiar processes like fires and chemical explosions are chemical chain reactions. The fission or "nuclear" chain-reaction, using fission-produced neutrons, is the source of energy for nuclear power plants and fission-type nuclear bombs, such as those detonated in Hiroshima and Nagasaki, Japan, at the end of World War II. Heavy nuclei such as uranium and thorium may also undergo spontaneous fission, but they are much more likely to undergo decay by alpha decay. 
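The shape of the binding-energy curve described above can be sketched with the semi-empirical mass formula (Bethe–Weizsäcker). This is a rough model only, and the coefficient values below are one common textbook fit; exact values vary between sources:

```python
def binding_energy_per_nucleon(Z, A):
    """Approximate binding energy per nucleon (MeV) for a nucleus
    with Z protons and mass number A, via the semi-empirical mass formula."""
    a_v, a_s, a_c, a_a, a_p = 15.75, 17.8, 0.711, 23.7, 11.18
    N = A - Z
    B = (a_v * A                              # volume term
         - a_s * A ** (2 / 3)                 # surface term
         - a_c * Z * (Z - 1) / A ** (1 / 3)   # Coulomb repulsion
         - a_a * (A - 2 * Z) ** 2 / A)        # asymmetry term
    # Pairing term: attractive for even-even nuclei, repulsive for odd-odd.
    if Z % 2 == 0 and N % 2 == 0:
        B += a_p / A ** 0.5
    elif Z % 2 == 1 and N % 2 == 1:
        B -= a_p / A ** 0.5
    return B / A

# The curve rises toward the iron/nickel region and falls again for heavy
# nuclei, which is why fusion of light nuclei and fission of heavy nuclei
# both release energy.
for Z, A, name in [(2, 4, "He-4"), (26, 56, "Fe-56"),
                   (28, 62, "Ni-62"), (92, 238, "U-238")]:
    print(name, round(binding_energy_per_nucleon(Z, A), 2))
```

Note the formula is least accurate for very light nuclei such as helium-4; its value lies in reproducing the overall trend, with the maximum near A ≈ 56–62 and a gradual decline beyond.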
For a neutron-initiated chain reaction to occur, there must be a critical mass of the relevant isotope present in a certain space under certain conditions. The conditions for the smallest critical mass require the conservation of the emitted neutrons and also their slowing or moderation so that there is a greater cross-section or probability of them initiating another fission. In two regions of Oklo, Gabon, Africa, natural nuclear fission reactors were active over 1.5 billion years ago. Measurements of natural neutrino emission have demonstrated that around half of the heat emanating from the Earth's core results from radioactive decay. However, it is not known if any of this results from fission chain reactions. Production of "heavy" elements According to the theory, as the Universe cooled after the Big Bang it eventually became possible for common subatomic particles as we know them (neutrons, protons and electrons) to exist. The most common particles created in the Big Bang which are still easily observable to us today were protons and electrons (in equal numbers). The protons would eventually form hydrogen atoms. Almost all the neutrons created in the Big Bang were absorbed into helium-4 in the first three minutes after the Big Bang, and this helium accounts for most of the helium in the universe today (see Big Bang nucleosynthesis). Some relatively small quantities of elements beyond helium (lithium, beryllium, and perhaps some boron) were created in the Big Bang, as the protons and neutrons collided with each other, but all of the "heavier elements" (carbon, element number 6, and elements of greater atomic number) that we see today, were created inside stars during a series of fusion stages, such as the proton–proton chain, the CNO cycle and the triple-alpha process. Progressively heavier elements are created during the evolution of a star. 
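The criticality condition for the neutron-initiated chain reaction described earlier in this section can be illustrated with a toy generation-by-generation model. The point-model simplification and the specific k_eff values are illustrative assumptions, not data from the text:

```python
def neutron_population(n0, k_eff, generations):
    """Neutron count after a number of fission generations, assuming each
    neutron causes k_eff new neutrons on average (simple point model)."""
    n = float(n0)
    for _ in range(generations):
        n *= k_eff
    return n

# Subcritical (k < 1) dies out; critical (k = 1) is steady;
# supercritical (k > 1) grows geometrically.
print(neutron_population(1000, 0.9, 50))  # decays toward zero
print(neutron_population(1000, 1.0, 50))  # stays at 1000
print(neutron_population(1000, 1.1, 50))  # grows to over 100,000
```

Moderation, as described above, raises the fission cross-section and so effectively raises k_eff for a given mass, which is why the smallest critical masses require both neutron conservation and slowing.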
Energy is only released in fusion processes involving smaller atoms than iron because the binding energy per nucleon peaks around iron (56 nucleons). Since the creation of heavier nuclei by fusion requires energy, nature resorts to the process of neutron capture. Neutrons (due to their lack of charge) are readily absorbed by a nucleus. The heavy elements are created by either a slow neutron capture process (the so-called s-process) or the rapid, or r-process. The s-process occurs in thermally pulsing stars (called AGB, or asymptotic giant branch stars) and takes hundreds to thousands of years to reach the heaviest elements of lead and bismuth. The r-process is thought to occur in supernova explosions, which provide the necessary conditions of high temperature, high neutron flux and ejected matter. These stellar conditions make
drier, whereas July tends to have more rainfall. Demographics Nuremberg has long been a destination for immigrants: 39.5% of its residents had an immigrant background in 2010 (counted with MigraPro). Economy For many people, Nuremberg is still associated with its traditional gingerbread (Lebkuchen) products, sausages, and handmade toys. Pocket watches, known as Nuremberg eggs, were made here in the 16th century by Peter Henlein. In a sample from 1797–1801, only one of the districts was early-industrial; the economic structure of the region around Nuremberg was dominated by metal and glass manufacturing, reflected in a share of nearly 50% for handicrafts and workers. In the 19th century, Nuremberg became the "industrial heart" of Bavaria, with companies such as Siemens and MAN establishing a strong base in the city. Nuremberg is still an important industrial centre with a strong standing in the markets of Central and Eastern Europe. Items manufactured in the area include electrical equipment, mechanical and optical products, motor vehicles, writing and drawing paraphernalia, stationery products and printed materials. The city is also strong in the fields of automation, energy and medical technology. Siemens is still the largest industrial employer in the Nuremberg region, and a good third of German market research agencies are also located in the city. The Nuremberg International Toy Fair, held at the city's exhibition centre, is the largest of its kind in the world. Tourism Nuremberg is Bavaria's second-largest city after Munich, and a popular tourist destination for foreigners and Germans alike. It was a leading city 500 years ago, but 90% of the town was destroyed in 1945 during the war. After World War II, many medieval-style areas of the town were rebuilt. Attractions Beyond its main attractions of the Imperial Castle, St.
Lorenz Church, and Nazi Trial grounds, there are 54 different museums for arts and culture, history, science and technology, family and children, and more niche categories, where visitors can see the world's oldest globe (built in 1492), a 500-year-old Madonna, and Renaissance-era German art. Several types of tours are offered in the city, including historic tours, Nazi-focused tours, underground and night tours, walking tours, sightseeing buses, self-guided tours, and an old-town tour on a mini train. Nuremberg also offers several parks and green areas, as well as indoor activities such as bowling, rock-wall climbing, escape rooms, cart racing, and mini golf, along with theatres and cinemas, pools and thermal spas. There are also six nearby amusement parks. The city's tourism board sells the Nürnberg Card, which allows free use of public transportation and free entry to all museums and attractions in Nuremberg for a two-day period. Culinary tourism Nuremberg is also a destination for food lovers. Culinary tourists can taste the city's famous Lebkuchen gingerbread, local beer, and Nürnberger Rostbratwürstchen, or Nuremberg sausages. There are hundreds of restaurants for all tastes, including traditional Franconian restaurants and beer gardens. The city also offers 17 vegan and vegetarian restaurants and seven fully organic restaurants. Nuremberg also boasts a restaurant with two Michelin stars, Essigbrätlein. Pedestrian zones Like many European cities, Nuremberg has a pedestrian-only zone covering a large portion of the old town, which is a main destination for shopping and specialty retail, including year-round Christmas stores where tourists and locals alike can purchase Christmas ornaments, gifts, decorations, and additions to their toy Christmas villages. The Craftsmen's Courtyard, or Handwerkerhof, is another tourist shopping destination in the style of a medieval village.
It houses several local family-run businesses selling items handcrafted from glass, wood, leather, pottery, and precious metals. The Handwerkerhof is also home to traditional German restaurants and beer gardens. The pedestrian zones of Nuremberg host festivals and markets throughout the year, the best known being the Christkindlesmarkt, Germany's largest Christmas market; the city also styles itself the gingerbread capital of the world. Visitors to the Christmas market can peruse the hundreds of stalls and purchase local wood crafts, nutcrackers, incense smokers, and prune people, while sampling Christmas sweets and traditional Glühwein. Hospitality In 2017, Nuremberg saw a total of 3.3 million overnight stays, a record for the town; it is expected to have surpassed that in 2018, with more growth in tourism anticipated in the coming years. There are over 175 registered places of accommodation in Nuremberg, ranging from hostels to luxury hotels and from bed and breakfasts to multi-hundred-room properties. As of 19 April 2019, Nuremberg had 306 Airbnb listings. Culture Nuremberg was an early centre of humanism, science, printing, and mechanical invention. The city contributed much to the science of astronomy. In 1471, Johannes Mueller of Königsberg (Bavaria), later called Regiomontanus, built an astronomical observatory in Nuremberg and published many important astronomical charts. In 1515, Albrecht Dürer, a native of Nuremberg, created woodcuts of the first maps of the stars of the northern and southern hemispheres, producing the first printed star charts, which had been commissioned by Johannes Stabius. Around 1515, Dürer also published the "Stabiussche Weltkarte", the first perspective drawing of the terrestrial globe. Printers and publishers have a long history in Nuremberg, and many of them worked with well-known artists of the day to produce books that could also be considered works of art. In 1470, Anton Koberger opened one of Europe's first large print shops in Nuremberg.
In 1493, he published the Nuremberg Chronicles, also known as the World Chronicles (Schedelsche Weltchronik), an illustrated history of the world from the creation to the present day. It was written in the local Franconian dialect by Hartmann Schedel and had illustrations by Michael Wohlgemuth, Wilhelm Pleydenwurff, and Albrecht Dürer. Others furthered geographical knowledge and travel by map making. Notable among these was navigator and geographer Martin Behaim, who made the first world globe. Sculptors such as Veit Stoss, Adam Kraft and Peter Vischer are also associated with Nuremberg. Composed of prosperous artisans, the guilds of the Meistersingers flourished here. Richard Wagner made their most famous member, Hans Sachs, the hero of his opera Die Meistersinger von Nürnberg. Baroque composer Johann Pachelbel was born here and was organist of St. Sebaldus Church. The academy of fine arts situated in Nuremberg is the oldest art academy in central Europe and looks back to a tradition of 350 years of artistic education. Nuremberg is also famous for its Christkindlesmarkt (Christmas market), which draws well over a million shoppers each year. The market is famous for its handmade ornaments and delicacies. Museums Germanisches Nationalmuseum House of Albrecht Dürer Kunsthalle Nürnberg Kunstverein Nürnberg Neues Museum Nürnberg (Modern Art Museum) Nuremberg Toy Museum Nuremberg Transport Museum Performing arts The Nuremberg State Theatre, founded in 1906, is dedicated to all types of opera, ballet and stage theatre. During the season 2009/2010, the theatre presented 651 performances for an audience of 240,000 persons. The State Philharmonic Nuremberg (Staatsphilharmonie Nürnberg) is the orchestra of the State Theatre. Its name was changed in 2011 from its previous name: The Nuremberg Philharmonic (Nürnberger Philharmoniker). It is the second-largest opera orchestra in Bavaria. 
Besides opera performances, it also presents its own subscription concert series in the Meistersingerhalle. Christof Perick was the principal conductor of the orchestra between 2006 and 2011; Marcus Bosch has headed the orchestra since September 2011. The Nuremberg Symphony Orchestra (Nürnberger Symphoniker) performs around 100 concerts a year to a combined annual audience of more than 180,000. The regular subscription concert series are mostly performed in the Meistersingerhalle, but other venues are used as well, including the new concert hall of the Kongresshalle and the Serenadenhof. Alexander Shelley has been the principal conductor of the orchestra since 2009. The Nuremberg International Chamber Music Festival (Internationales Kammermusikfestival Nürnberg) takes place in early September each year and celebrated its tenth anniversary in 2011. Concerts take place around the city; the opening and closing events are held in the medieval Burg. The Bardentreffen, an annual folk festival held in Nuremberg since 1976, has been deemed the largest world music festival in Germany. In 2014, the Bardentreffen featured 368 artists from 31 nations. Cuisine Nuremberg is known for Nürnberger Bratwurst, which is shorter and thinner than other bratwurst sausages. Another Nuremberg speciality is Nürnberger Lebkuchen, a kind of gingerbread eaten mainly around Christmas time. Education Nuremberg offers 51 public and 6 private elementary schools in nearly all of its districts. Secondary education is offered at 23 Mittelschulen, 12 Realschulen, and 17 Gymnasien (state, city, church, and privately owned). There are also several other providers of secondary education, such as Berufsschulen, Berufsfachschulen, Wirtschaftsschulen, etc.
Higher education Nuremberg hosts the joint university Friedrich-Alexander-Universität Erlangen-Nürnberg, two Fachhochschulen (Technische Hochschule Nürnberg and Evangelische Hochschule Nürnberg), a pure art academy (Akademie der Bildenden Künste Nürnberg, the first art academy in the German-speaking world) in addition to the design faculty at the TH, and a music conservatoire (Hochschule für Musik Nürnberg). There are also private schools, such as the Akademie Deutsche POP Nürnberg, offering higher education. Main sights Nuremberg Castle: the group of fortifications that tower over the city, comprising the central burgraves' castle, the Free Imperial City's buildings to the east, and the Imperial castle to the west. Heilig-Geist-Spital: in the centre of the city, on the bank of the river Pegnitz, stands the Hospital of the Holy Spirit. Founded in 1332, it was one of the largest hospitals of the Middle Ages. Lepers were kept here at some distance from the other patients. It now houses elderly persons and a restaurant. The Hauptmarkt, dominated by the front of the unique Gothic Frauenkirche (Our Lady's Church), provides a picturesque setting for the famous Christmas market. A main attraction on the square is the Gothic Schöner Brunnen (Beautiful Fountain), which was erected around 1385 but subsequently replaced with a replica (the original fountain is kept in the Germanisches Nationalmuseum). The unchanged Renaissance bridge Fleischbrücke crosses the Pegnitz nearby. The Gothic Lorenzkirche (St. Laurence church) dominates the southern part of the walled city and is one of the most important buildings in Nuremberg. The main body was built around 1270–1350. The even earlier and equally impressive Sebalduskirche is St. Lorenz's counterpart in the northern part of the old city.
The church of the former Katharinenkloster is preserved as a ruin, the charterhouse (Kartause) is integrated into the building of the Germanisches Nationalmuseum, and the choir of the former Franziskanerkirche is part of a modern building. Other churches located inside the city walls are: St. Laurence's, Saint Clare's, Saint Martha's, Saint James the Greater's, Saint Giles's, and Saint Elisabeth's. The Germanisches Nationalmuseum is Germany's largest museum of cultural history; among its exhibits are works of famous painters such as Albrecht Dürer, Rembrandt, and Ernst Ludwig Kirchner. The Neues Museum Nürnberg is a museum for modern and contemporary art. A marketplace was built over the former Jewish quarter. The plague returned to the city in 1405, 1435, 1437, 1482, 1494, 1520 and 1534. The largest growth of Nuremberg occurred in the 14th century. Charles IV's Golden Bull of 1356, naming Nuremberg as the city where newly elected kings of Germany must hold their first Imperial Diet, made Nuremberg one of the three most important cities of the Empire. Charles was the patron of the Frauenkirche, built between 1352 and 1362 (the architect was likely Peter Parler), where the Imperial court worshipped during its stays in Nuremberg. The royal and Imperial connection grew stronger in 1423, when the Holy Roman Emperor Sigismund of Luxembourg granted the Imperial regalia to be kept permanently in Nuremberg, where they remained until 1796, when the advance of French troops required their removal to Regensburg and thence to Vienna. In 1349, the members of the guilds unsuccessfully rebelled against the patricians in an uprising (the 'Craftsmen's Uprising'), supported by merchants and some councillors; this led to a ban on any self-organisation of the artisans in the city, abolishing the guilds that were customary elsewhere in Europe. The unions were then dissolved, and the oligarchs remained in power while Nuremberg was a free city (until the early 19th century).
Charles IV conferred upon the city the right to conclude alliances independently, thereby placing it upon a politically equal footing with the princes of the Empire. Frequent fights took place with the burgraves – without, however, inflicting lasting damage upon the city. After fire destroyed the castle in 1420 during a feud between Frederick IV (from 1417 Margrave of Brandenburg) and the duke of Bavaria-Ingolstadt, the city purchased the ruins and the forest belonging to the castle (1427), resulting in the city's total sovereignty within its borders. Through these and other acquisitions the city accumulated considerable territory. The Hussite Wars (1419–1434), a recurrence of the Black Death in 1437, and the First Margrave War (1449–1450) led to a severe fall in population in the mid-15th century. Siding with Albert IV, Duke of Bavaria-Munich, in the Landshut War of Succession of 1503–1505 led the city to gain substantial territory, resulting in lands of , making it one of the largest Imperial cities. During the Middle Ages, Nuremberg fostered a rich, varied, and influential literary culture. Early modern age The cultural flowering of Nuremberg in the 15th and 16th centuries made it the centre of the German Renaissance. In 1525 Nuremberg accepted the Protestant Reformation, and in 1532 the Nuremberg Religious Peace was signed there, preventing war between Lutherans and Catholics for 15 years. During the Princes' 1552 revolution against Charles V, Nuremberg tried to purchase its neutrality, but Margrave Albert Alcibiades, one of the leaders of the revolt, attacked the city without a declaration of war and dictated a disadvantageous peace. At the 1555 Peace of Augsburg, the possessions of the Protestants were confirmed by the Emperor, their religious privileges extended and their independence from the Bishop of Bamberg affirmed, while the 1520s' secularisation of the monasteries was also approved. 
Families like the Tucher, Imhoff or Haller ran trading businesses across Europe, similar to the Fugger and Welser families from Augsburg, although on a slightly smaller scale. The state of affairs in the early 16th century, the growth of trade routes elsewhere, and the ossification of the social hierarchy and legal structures all contributed to the decline in trade. During the Thirty Years' War, frequent quartering of Imperial, Swedish and League soldiers, the financial costs of the war and the cessation of trade caused irreparable damage to the city and a near-halving of the population. In 1632, the city, occupied by the forces of Gustavus Adolphus of Sweden, was besieged by the army of Imperial general Albrecht von Wallenstein. The city declined after the war and recovered its importance only in the 19th century, when it grew as an industrial centre. Even after the Thirty Years' War, however, there was a late flowering of architecture and culture – secular Baroque architecture is exemplified in the layout of the civic gardens built outside the city walls, and in the Protestant city's rebuilding of St. Egidien church, destroyed by fire at the beginning of the 18th century, considered a significant contribution to the Baroque church architecture of Middle Franconia. After the Thirty Years' War, Nuremberg attempted to remain detached from external affairs, but contributions were demanded for the War of the Austrian Succession and the Seven Years' War, and restrictions on imports and exports deprived the city of many markets for its manufactures. The Bavarian elector, Charles Theodore, appropriated part of the land obtained by the city during the Landshut War of Succession, to which Bavaria had maintained its claim; Prussia also claimed part of the territory. Realising its weakness, the city asked to be incorporated into Prussia, but Frederick William II refused, fearing to offend Austria, Russia and France.
At the Imperial diet in 1803, the independence of Nuremberg was affirmed, but on the signing of the Confederation of the Rhine on 12 July 1806, it was agreed to hand the city over to Bavaria from 8 September, with Bavaria guaranteeing the amortisation of the city's 12.5 million guilder public debt. After the Napoleonic Wars After the fall of Napoleon, the city's trade and commerce revived; the skill of its inhabitants together with its favourable situation soon made the city prosperous, particularly after its public debt had been acknowledged as a part of the Bavarian national debt. Having been incorporated into a Catholic country, the city was compelled to refrain from further discrimination against Catholics, who had been excluded from the rights of citizenship. Catholic services had been celebrated in the city by the priests of the Teutonic Order, often under great difficulties. After their possessions had been confiscated by the Bavarian government in 1806, they were given the Frauenkirche on the Market in 1809; in 1810 the first Catholic parish was established, which in 1818 numbered 1,010 souls. In 1817, the city was incorporated into the district of Rezatkreis (named for the river Franconian Rezat), which was renamed Middle Franconia () on 1 January 1838. The first German railway, the Bavarian Ludwigsbahn, from Nuremberg to nearby Fürth, was opened in 1835. With the establishment of railways and the incorporation of Bavaria into the Zollverein (the 19th-century German Customs Union), commerce and industry opened the way to greater prosperity. In 1852, there were 53,638 inhabitants: 46,441 Protestants and 6,616 Catholics. The city subsequently grew to become the most important industrial city of southern Germany and one of its most prosperous towns; after the Austro-Prussian War, however, its telegraph station was among those that had to be given up to Prussia.
In 1905, its population, including several incorporated suburbs, was 291,351: 86,943 Catholics, 196,913 Protestants, 3,738 Jews and 3,766 members of other creeds. Nazi era Nuremberg held great significance during the Nazi Germany era. Because of the city's relevance to the Holy Roman Empire and its position in the centre of Germany, the Nazi Party chose the city to be the site of huge Nazi Party conventions – the Nuremberg rallies. The rallies were held in 1927, 1929 and annually from 1933 through 1938. After Adolf Hitler's rise to power in 1933 the Nuremberg rallies became huge Nazi propaganda events, a centre of Nazi ideals. The 1934 rally was filmed by Leni Riefenstahl, and made into a propaganda film called Triumph des Willens (Triumph of the Will). At the 1935 rally, Hitler specifically ordered the Reichstag to convene at Nuremberg to pass the Nuremberg Laws which revoked German citizenship for all Jews and other non-Aryans. A number of premises were constructed solely for these assemblies, some of which were not finished. Today many examples of Nazi architecture can still be seen in the city. The city was also the home of the Nazi propagandist Julius Streicher, the publisher of Der Stürmer. During the Second World War, Nuremberg was the headquarters of Wehrkreis (military district) XIII, and an important site for military production, including aircraft, submarines and tank engines. A subcamp of Flossenbürg concentration camp was located here, and extensively used slave labour. The city was severely damaged in Allied strategic bombing from 1943 to 1945. On 29 March 1944, the RAF endured its heaviest losses in the bombing campaign of Germany. Out of more than 700 planes participating, 106 were shot down or crash-landed on the way home to their bases, and more than 700 men were missing, as many as 545 of them dead. More than 160 became prisoners of war. On 2 January 1945, the medieval city centre was systematically bombed by the Royal Air Force and the U.S. 
Army Air Forces and about ninety percent of it was destroyed in only one hour, with 1,800 residents killed and roughly 100,000 displaced. In February 1945, additional attacks followed. In total, about 6,000 Nuremberg residents are estimated to have been killed in air raids. Nuremberg was a heavily fortified city that was captured in a fierce battle lasting from 17 to 21 April 1945 by the U.S. 3rd Infantry Division, 42nd Infantry Division and 45th Infantry Division, which fought house-to-house and street-by-street against determined German resistance, causing further urban devastation to the already bombed and shelled buildings. Despite this intense degree of destruction, the city was rebuilt after the war and was to some extent restored to its pre-war appearance, including the reconstruction of some of its medieval buildings. Much of this reconstructive work and conservation was done by the organisation 'Old Town Friends Nuremberg'. However, over half of the historic appearance of the centre, especially the northeastern half of the old Imperial Free City, was not restored. Nuremberg trials Between 1945 and 1946, German officials involved in war crimes and crimes against humanity were brought before an international tribunal in the Nuremberg trials. The Soviet Union had wanted these trials to take place in Berlin. However, Nuremberg was chosen as the site for the trials for specific reasons: The city had been the location of the Nazi Party's Nuremberg rallies and the laws stripping Jews of their citizenship were passed there. There was symbolic value in making it the place of Nazi demise. The Palace of Justice was spacious and largely undamaged (one of the few buildings that had remained largely intact despite extensive Allied bombing of Germany). The already large courtroom was reasonably easily expanded by the removal of the wall at the end opposite the bench, thereby incorporating the adjoining room. A large prison was also part of the complex.
As a compromise, it was agreed that Berlin would become the permanent seat of the International Military Tribunal and that the first trial (several were planned) would take place in Nuremberg. Due to the Cold War, subsequent trials never took place. Following the trials, in October 1946, many prominent German Nazi politicians and military leaders were executed in Nuremberg. The same courtroom in Nuremberg was the venue of the Nuremberg Military Tribunals, organized by the United States as occupying power in the area. Geography Several old villages now belong to the city, for example Grossgründlach, Kraftshof, Thon, and Neunhof in the north-west; Ziegelstein in the north-east; Altenfurt and Fischbach in the south-east; and Katzwang and Kornburg in the south. Langwasser is a modern suburb. Climate Nuremberg has an oceanic climate (Köppen Cfb) with a certain humid continental influence (Dfb), falling into the latter category if the 0 °C isotherm is used. The city's climate is influenced by its inland position and higher altitude. Winters are changeable, with either mild or cold weather: the average temperature is around to , while summers are generally warm, mostly around at night to in the afternoon. Precipitation is evenly spread throughout the year, although February and April tend to be a bit drier whereas July tends to have more rainfall. Demographics Nuremberg has been a destination for immigrants: 39.5% of the residents had an immigrant background in 2010 (counted with MigraPro). Economy Nuremberg for many people is still associated with its traditional gingerbread (Lebkuchen) products, sausages, and handmade toys. Pocket watches — Nuremberg eggs — were made here in the 16th century by Peter Henlein. Only one of the districts in the 1797–1801 sample was early industrial; the economic structure of the region around Nuremberg was dominated by metal and glass manufacturing, reflected in a share of nearly 50% of handicraft trades and workers.
In the 19th century Nuremberg became the "industrial heart" of Bavaria, with companies such as Siemens and MAN establishing a strong base in the city. Nuremberg is still an important industrial centre with a strong standing in the markets of Central and Eastern Europe. Items manufactured in the area include electrical equipment, mechanical and optical products, motor vehicles, writing and drawing paraphernalia, stationery products and printed materials. The city is also strong in the fields of automation, energy and medical technology. Siemens is still the largest industrial employer in the Nuremberg region, and a good third of German market research agencies are also located in the city. The Nuremberg International Toy Fair, held at the city's exhibition centre, is the largest of its kind in the world. Tourism Nuremberg is Bavaria's second largest city after Munich, and a popular tourist destination for foreigners and Germans alike. It was a leading city 500 years ago, but 90% of the town was destroyed in 1945 during the war. After World War II, many medieval-style areas of the town were rebuilt. Attractions Beyond its main attractions of the Imperial Castle, St. Lorenz Church, and Nazi Trial grounds, there are 54 different museums for arts and culture, history, science and technology, family and children, and more niche categories, where visitors can see the world's oldest globe (built in 1492), a 500-year-old Madonna, and Renaissance-era German art. There are several types of tours offered in the city, including historic tours, those that are Nazi-focused, underground and night tours, walking tours, sightseeing buses, self-guided tours, and an old town tour on a mini train. Nuremberg also offers several parks and green areas, as well as indoor activities such as bowling, rock wall climbing, escape rooms, cart racing, mini golf, theatres and cinemas, and pools and thermal spas. There are also six nearby amusement parks.
The city's tourism board sells the Nurnberg Card, which allows for free use of public transportation and free entry to all museums and attractions in Nuremberg for a two-day period. Culinary tourism Nuremberg is also a destination for food lovers. Culinary tourists can taste the city's famous Lebkuchen gingerbread, local beer, and Nürnberger Rostbratwürstchen, or Nuremberg sausages. There are hundreds of restaurants for all tastes, including traditional Franconian restaurants and beer gardens. The city also offers 17 vegan and vegetarian restaurants and seven fully organic restaurants. Nuremberg also boasts a two-Michelin-starred restaurant, Essigbrätlein. Pedestrian zones Like many European cities, Nuremberg offers a pedestrian-only zone covering a large portion of the old town, which is a main destination for shopping and specialty retail, including year-round Christmas stores where tourists and locals alike can purchase Christmas ornaments, |
deplore the way they are going about it." After graduation, he worked at the Washington Monthly, as an associate editor and then managing editor; at Texas Monthly, as an associate editor and then executive editor; at The Washington Post, as a member of the national staff; at The Atlantic Monthly, as national correspondent; and at The New Yorker, as staff writer and then Washington correspondent. On September 1, 2003, Lemann became dean of the Graduate School of Journalism at Columbia University. During Lemann's time as dean, the Journalism School launched and completed its first capital fundraising campaign, added 20 members to its full-time faculty, built a student center, started its first new professional degree program since the 1930s, and launched initiatives in investigative reporting, digital journalism, executive leadership for news organizations, and other areas. He stepped down as dean in 2013, following two five-year terms. In 2015, Lemann launched Columbia Global Reports, a university-funded publishing imprint that produces four to six ambitious works of journalism and analysis a year, each on a different underreported story in the world. From 2017 to early 2021, he was the director of Columbia World Projects. Lemann has published five books, including Transaction Man: The Rise of the Deal and the Decline of the American Dream (2019), Redemption: The Last Battle of the Civil War (2006); The Big Test: The Secret History of the American Meritocracy (1999); and The Promised Land: The Great Black Migration and How It Changed America (1991), which won several book prizes. He has written widely for such publications as The New York Times, The New York Review of Books, The New Republic, and Slate; worked in documentary television with Blackside, Inc., Frontline, the Discovery Channel, and the BBC; and lectured at many universities. 
Lemann serves on the boards of directors of the Authors Guild, the National Academy of Sciences’ Division of Behavioral and Social Sciences and Education, and the Academy of Political Science, and is a member of the New York Institute for the Humanities. He was named a fellow of the American Academy of Arts and Sciences in April 2010. Personal Lemann has been married twice. His first wife was Dominique Alice Browning, who later became an editor in chief of House & Garden until 2007; they married on May 20, 1983, have two sons, Alexander and Theodore, and later divorced. His second wife is Judith Anne Shulevitz, a columnist for Slate, The New York Times Book Review, and The New Republic. Married on November 7, 1999, they have a son and a daughter.
Bibliography Books The Big Test: The Secret History of the American Meritocracy (1999) – a history of how standardized tests, and in particular the SAT, shaped the United States in the second half of the 20th century. "None of the Above", review by Andrew Sullivan, The New York Times, October 24, 1999. "BOOKS OF THE TIMES; What's Wrong With the SAT and Its Elite Progeny", review by Christopher Lehmann-Haupt, The New York Times, October 4, 1999. Redemption: The Last Battle of the Civil War (2006) – a history of Reconstruction in the South after the Civil War. "A Less Perfect Union", review by Sean Wilentz, The New York Times, September 10, 2006. First chapter of
not. Since the Earth is not a perfect sphere but is an oblate spheroid with slightly flattened poles, a minute of latitude is not constant, but about 1861 metres at the poles and 1843 metres at the Equator. France and other metric countries state that in principle a nautical mile is an arcminute of a meridian at a latitude of 45°, but that is a modern justification for a more mundane calculation that was developed a century earlier. By the mid 19th century France had defined a nautical mile via the original 1791 definition of the metre, one ten-millionth of a quarter meridian. Thus became the metric length for a nautical mile. France made it legal for the French Navy in 1906, and many metric countries voted to sanction it for international use at the 1929 International Hydrographic Conference. Both the United States and the United Kingdom used an average arcminute, specifically, a minute of arc of a great circle of a sphere having the same surface area as the Clarke 1866 ellipsoid. The authalic (equal area) radius of the Clarke 1866 ellipsoid is . The resulting arcminute is . The United States chose five significant digits for its nautical mile, 6080.2 feet, whereas the United Kingdom chose four significant digits for its Admiralty mile, 6080 feet. In 1929, the international nautical mile was defined by the First International Extraordinary Hydrographic Conference in Monaco as exactly 1852 metres. The United States did not adopt the international nautical mile until 1954. Britain adopted it in 1970, but legal references to the obsolete unit are now converted to 1853 metres. Similar definitions The metre was originally defined as of the length of the meridian arc from the North pole to the equator, thus one kilometre of distance |
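The latitude dependence described above (a meridian arcminute of roughly 1843 metres at the Equator, about 1861 metres at the poles, and close to the 1852-metre international definition at 45°) can be checked numerically. A minimal sketch, assuming the modern WGS84 ellipsoid parameters rather than the historical Clarke 1866 figures, and using the meridional radius of curvature M(φ) = a(1 − e²)/(1 − e²·sin²φ)^(3/2):

```python
import math

# WGS84 ellipsoid parameters (an assumption for illustration; the
# historical figures in the text used slightly different ellipsoids)
A = 6378137.0            # equatorial radius in metres
E2 = 6.69437999014e-3    # first eccentricity squared

def meridian_arcminute(lat_deg: float) -> float:
    """Length in metres of one minute of latitude at the given latitude,
    via the meridional radius of curvature M = a(1-e^2)/(1-e^2 sin^2 phi)^1.5."""
    phi = math.radians(lat_deg)
    m = A * (1 - E2) / (1 - E2 * math.sin(phi) ** 2) ** 1.5
    return m * math.pi / (180 * 60)   # radius of curvature times one arcminute

print(round(meridian_arcminute(0), 1))    # equator: ~1842.9 m
print(round(meridian_arcminute(90), 1))   # poles:   ~1861.6 m
print(round(meridian_arcminute(45), 1))   # ~1852.2 m, near the international definition
```

The near-agreement at 45° illustrates why the 1852-metre figure is often retro-justified as "an arcminute of a meridian at 45°", even though, as the text notes, the French definition actually came from the metre itself.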
was —as in Greek, Etruscan, Latin and modern languages. Use in writing systems represents a dental or alveolar nasal in virtually all languages that use the Latin alphabet, and in the International Phonetic Alphabet. A common digraph with is , which represents a velar nasal in a variety of languages, usually positioned word-finally in English. Often, before a velar plosive (as in ink or jungle), alone represents a velar nasal. In Italian and French, represents a palatal nasal . The Portuguese and Vietnamese spelling for this sound is , while Spanish, Breton, and a few other languages use the letter . In English, is generally silent when it is preceded by an at the end of words, as in hymn; however, it is pronounced in this combination when occurring word medially, as in hymnal. On the other hand, other consonants are often silent when they precede an at the beginning of an English word. Examples include gnome, knife, mnemonic, and pneumonia. is the sixth-most common letter and the second-most commonly used consonant in the English language (after ). Other uses In mathematics, the italic form n is a particularly common symbol for a variable quantity which represents a natural number.
The set of natural numbers is referred to as Related characters Descendants and related characters in the Latin alphabet N with diacritics: Ń ń Ñ ñ Ň ň Ǹ ǹ Ṅ ṅ Ṇ ṇ Ņ ņ Ṉ ṉ Ṋ ṋ Ꞥ ꞥ ᵰ ᶇ Phonetic alphabet symbols related to N (the International Phonetic Alphabet only uses lowercase, but uppercase forms are used in some other writing systems): Ŋ ŋ : Latin letter eng, which represents a velar nasal in the IPA Ɲ ɲ : Latin letter Ɲ, which represents a palatal nasal or an alveolo-palatal nasal in the IPA n : Superscript small n, which represents a nasal release in the IPA Ƞ ƞ : Latin letter Ƞ (encoded in |
pressure. Other dies are used to cut grooves and ridges. Wire nails were also known as "French nails" for their country of origin. Belgian wire nails began to compete in England in 1863. Joseph Henry Nettlefold was making wire nails at Smethwick by 1875. Over the following decades, the nail-making process was almost completely automated. Eventually the industry had machines capable of quickly producing huge numbers of inexpensive nails with little or no human intervention. With the introduction of cheap wire nails, the use of wrought iron for nail making quickly declined, as more slowly did the production of cut nails. In the United States, in 1892 more steel-wire nails were produced than cut nails. In 1913, 90% of manufactured nails were wire nails. Nails went from being rare and precious to being a cheap mass-produced commodity. Today almost all nails are manufactured from wire, but the term "wire nail" has come to refer to smaller nails, often available in a wider, more precise range of gauges than is typical for larger common and finish nails. Materials Nails were formerly made of bronze or wrought iron and were crafted by blacksmiths and nailors. These craftspeople used a heated square iron rod that they forged before hammering the sides to form a point. After reheating and cutting off, the blacksmith or nailor inserted the hot nail into an opening and hammered it. Later, new ways of making nails were devised using machines to shear the nails before wiggling the bar sideways to produce a shank. For example, Type A cut nails were sheared from an iron bar by a guillotine-type cutter using early machinery. This method was slightly altered until the 1820s, when new heads on the nails' ends were pounded via a separate mechanical nail-heading machine. In the 1810s, iron bars were flipped over after each stroke while the cutter set was at an angle. Each nail was then sheared off the taper, which allowed an automatic grip of each nail and also formed the heads.
Type B nails were created this way. In 1886, 10 percent of the nails made in the United States were of the soft steel wire variety, and by 1892, steel wire nails overtook iron cut nails as the main type of nails being produced. In 1913, wire nails were 90 percent of all nails produced. Today's nails are typically made of steel, often dipped or coated to prevent corrosion in harsh conditions or to improve adhesion. Ordinary nails for wood are usually of a soft, low-carbon or "mild" steel (about 0.1% carbon, the rest iron and perhaps a trace of silicon or manganese). Nails for concrete are harder, with 0.5–0.75% carbon. Types Types of nail include: Aluminum nails – made of aluminum in many shapes and sizes for use with aluminum architectural metals Box nail – like a common nail but with a thinner shank and head Brads – small, thin, tapered nails with a lip or projection to one side rather than a full head, or a small finish nail Floor brad ('stigs') – flat, tapered and angular, for use in fixing floor boards Oval brad – ovals utilize the principles of fracture mechanics to allow nailing without splitting. Highly anisotropic materials like regular wood (as opposed to wood composites) can easily be wedged apart. Use of an oval perpendicular to the wood's grain cuts the wood fibers rather than wedging them apart, and thus allows fastening without splitting, even close to edges Panel pins Tacks or tintacks – short, sharp-pointed nails often used with carpet, fabric and paper. Normally cut from sheet steel (as opposed to wire); the tack is used in upholstery, shoe making and saddle manufacture. The triangular shape of the nail's cross section gives greater grip and less tearing of materials such as cloth and leather compared to a wire nail.
Brass tack – brass tacks are commonly used where corrosion may be an issue, such as furniture where contact with human skin salts will cause corrosion on steel nails Canoe tack – a clinching (or clenching) nail. The nail point is tapered so that it can be turned back on itself using a clinching iron. It then bites back into the wood from the side opposite the nail's head, forming a rivet-like fastening. Clench-nails are used in building clinker boats. Shoe tack – a clinching nail (see above) for clinching leather and sometimes wood, formerly used for handmade shoes. Carpet tack Upholstery tacks – used to attach coverings to furniture Thumbtacks (or "push-pins" or "drawing-pins") – lightweight pins used to secure paper or cardboard. Casing nails – have a head that is smoothly tapered, in comparison to the "stepped" head of a finish nail. When used to install casing around windows or doors, they allow the wood to be pried off later with minimal damage when repairs are needed, and without the need to dent the face of the casing in order to grab and extract the nail. Once the casing has been removed, the nails can be extracted from the inner frame with any of the usual nail pullers Clout nail – a roofing nail Coil nail – nails designed for use in a pneumatic nail gun, assembled in coils Common nail – smooth-shank wire nail with a heavy, flat head; the typical nail for framing Convex head (nipple head, springhead) roofing nail – an umbrella-shaped head with a rubber gasket for fastening metal roofing, usually with a ring shank Copper nail – nails made of copper for use with copper flashing or slate shingles etc. D-head (clipped head) nail – a common or box nail with part of the head removed for some pneumatic nail guns Double-ended nail – a rare type of nail with points on both ends and the "head" in the middle for joining boards together; similar to a dowel nail but with a head on the shank.
Double-headed (duplex, formwork, shutter, scaffold) nail – used for temporary nailing; nails can easily be pulled for later disassembly Dowel nail – a double-pointed nail without a "head" on the shank, a piece of round steel sharpened on both ends Drywall (plasterboard) nail – short, hardened, ring-shank nail with a very thin head Fiber cement nail – a nail for installing fiber cement siding Finish nail (bullet head nail, lost-head nail) – a wire nail with a small head intended to be minimally visible or driven below the wood surface and the hole filled to be invisible Gang nail – a nail plate Hardboard pin – a small nail for fixing hardboard or thin plywood, often with a square shank Horseshoe nail – nails used to hold horseshoes on hoofs Joist hanger nail – special nails rated for use with joist hangers and similar brackets. Sometimes called "Teco nails" ( × .148 shank nails used in metal connectors such as hurricane ties) Lost-head nail – see finish nail Masonry (concrete) nail – lengthwise fluted, hardened nail for use in concrete Oval wire nail – nails with an oval shank Panel pin Gutter spike – large, long nail intended to hold wooden gutters and some metal gutters in place at the bottom edge of a roof Ring (annular, improved, jagged) shank nail – nails that have ridges circling the shank to provide extra resistance to pulling out Roofing (clout) nail – generally a short nail with a broad head used with asphalt shingles, felt paper or the like Screw (helical) nail – a nail with a spiral shank; uses include flooring and assembling pallets Shake (shingle) nail – small-headed nails for nailing shakes and shingles Sprig – a small nail with either a headless, tapered shank or a square shank with a head on one side, commonly used by glaziers to fix a glass pane into a wooden frame Square nail – a cut nail T-head nail – shaped like the letter T Veneer pin Wire (French) nail – a general term for a nail with a round shank.
These are sometimes called French nails from their country of invention Wire-weld collated nail – nails held together with slender wires for use in nail guns Sizes Most countries, except the United States, use a metric system for describing nail sizes. A 50 × 3.0 indicates a nail 50 mm long (not including the head) and 3 mm in diameter. Lengths are rounded to the nearest millimetre. For example, finishing nail sizes typically available from German suppliers are: Drahtstift mit Senkkopf (Stahl, DIN 1151) United States penny sizes In the United States, the length of a nail is designated by its penny size. Terminology Box: a wire nail with a head; box nails have a smaller shank than common nails of the same size Bright: no surface coating; not recommended for weather exposure or acidic or treated lumber Casing: a wire nail with a slightly larger head than finish nails; often used for flooring CC or Coated: "cement coated"; nail coated with adhesive, also known as cement or glue, for greater holding power; also resin- or vinyl-coated; the coating melts from friction when driven, helping to lubricate, then adheres when cool; color varies by manufacturer (tan and pink are common) Common: a common construction wire nail with a disk-shaped head that is typically 3 to 4 times the diameter of the shank; common nails have larger shanks than box nails of the same size Cut: machine-made square nails. Now used for masonry | number of references to nails, including the story in Judges of Jael the wife of Heber, who drives a nail (or tent-peg) into the temple of a sleeping Canaanite commander; the provision of iron for nails by King David for what would become Solomon's Temple; and in connection with the crucifixion of Christ. The Romans made extensive use of nails. The Roman army, for example, left behind seven tons of nails when it evacuated the fortress of Inchtuthil in Perthshire in the United Kingdom in 86 to 87 CE.
The term "penny", as it refers to nails, probably originated in medieval England to describe the price of a hundred nails. Nails themselves were sufficiently valuable and standardized to be used as an informal medium of exchange. Until around 1800 artisans known as nailers or nailors made nails by hand – note the surname Naylor. (Workmen called slitters cut up iron bars to a suitable size for nailers to work on. From the late 16th century, manual slitters disappeared with the rise of the slitting mill, which cut bars of iron into rods with an even cross-section, saving much manual effort.) At the time of the American Revolution, England was the largest manufacturer of nails in the world. Nails were expensive and difficult to obtain in the American colonies, so that abandoned houses were sometimes deliberately burned down to allow recovery of used nails from the ashes. This became such a problem in Virginia that a law was created to stop people from burning their houses when they moved. Families often had small nail-manufacturing setups in their homes; during bad weather and at night, the entire family might work at making nails for their own use and for barter. Thomas Jefferson wrote in a letter: "In our private pursuits it is a great advantage that every honest employment is deemed honorable. I am myself a nail maker." The growth of the trade in the American colonies was theoretically held back by the prohibition of new slitting mills in America by the Iron Act of 1750, though there is no evidence that the Act was actually enforced. The production of wrought-iron nails continued well into the 19th century, but ultimately was reduced to nails for purposes for which the softer cut nails were unsuitable, including horseshoe nails. 
Cut The slitting mill, introduced to England in 1590, simplified the production of nail rods, but the first real efforts to mechanise the nail-making process itself occurred between 1790 and 1820, initially in the United States and England, when various machines were invented to automate and speed up the process of making nails from bars of wrought iron. In Sweden, too, Christopher Polhem produced a nail-cutting machine in the early 1700s as part of his automated factory. These nails were known as cut nails or square nails because of their roughly rectangular cross-section. Cut nails were one of the important factors in the increase in balloon framing beginning in the 1830s and thus the decline of timber framing with wooden joints. Though still used for historical renovations and for heavy-duty applications, such as attaching boards to masonry walls, cut nails are much less common today than wire nails. The cut-nail process was patented in America by Jacob Perkins in 1795 and in England by Joseph Dyer, who set up machinery in Birmingham. The process was designed to cut nails from sheets of iron, while making sure that the fibres of the iron ran down the nails. The Birmingham industry expanded in the following decades, and reached its greatest extent in the 1860s, after which it declined due to competition from wire nails, but continued until the outbreak of World War I. Wire Wire nails are formed from wire. Usually coils of wire are drawn through a series of dies to reach a specific diameter, then cut into short rods that are then formed into nails. The nail tip is usually cut by a blade; the head is formed by reshaping the other end of the rod under high pressure. Other dies are used to cut grooves and ridges. Wire nails were also known as "French nails" for their country of origin. Belgian wire nails began to compete in England in 1863. Joseph Henry Nettlefold was making wire nails at Smethwick by 1875.
Over the following decades, the nail-making process was almost completely automated. Eventually the industry had machines capable of quickly producing huge numbers of inexpensive nails with little or no human intervention. With the introduction of cheap wire nails, the use of wrought iron for nail making quickly declined, as more slowly did the production of cut nails. In the United States, in 1892 more steel-wire nails were produced than cut nails. By 1913, 90% of manufactured nails were wire nails. Nails went from being rare and precious to being a cheap mass-produced commodity. Today almost all nails are manufactured from wire, but the term "wire nail" has come to refer to smaller nails, often available in a wider, more precise range of gauges than is typical for larger common and finish nails. Materials Nails were formerly made of bronze or wrought iron and were crafted by blacksmiths and nailors. These craftspeople used a heated square iron rod that they forged before hammering the sides to form a point. After reheating and cutting it off, the blacksmith or nailor inserted the hot nail into an opening and hammered it. Later, new ways of making nails were created using machines to shear the nails before wiggling the bar sideways to produce a shank. For example, Type A cut nails were sheared from an iron bar by a guillotine-type cutter using early machinery. This method was little altered until the 1820s, when heads began to be pounded onto the nails' ends by a separate mechanical nail-heading machine. In the 1810s, iron bars were flipped over after each stroke while the cutter was set at an angle. Each nail was then sheared off at a taper, allowing for an automatic grip of each nail which also formed their heads. Type B nails were created this way. In 1886, 10 percent of the nails made in the United States were of the soft steel wire variety, and by 1892 steel wire nails overtook iron cut nails as the main type of nail being produced.
In 1913, wire nails were 90 percent of all nails produced. Today's nails are typically made of steel, often dipped or coated to prevent corrosion in harsh conditions or to improve adhesion. Ordinary nails for wood are usually of a soft, low-carbon or "mild" steel (about 0.1% carbon, the rest iron and perhaps a trace of silicon or manganese). Nails for concrete are harder, with 0.5–0.75% carbon. Types Types of nail include:
Aluminum nails – made of aluminum in many shapes and sizes for use with aluminum architectural metals
Box nail – like a common nail but with a thinner shank and head
Brads are small, thin, tapered nails with a lip or
spilled over and adversely affected Namibians living in the north of the country. In 1998, Namibia Defence Force (NDF) troops were sent to the Democratic Republic of the Congo as part of a Southern African Development Community (SADC) contingent. In 1999, the national government quashed a secessionist attempt in the northeastern Caprivi Strip. The Caprivi conflict was initiated by the Caprivi Liberation Army (CLA), a rebel group led by Mishake Muyongo, which wanted the Caprivi Strip to secede and form its own society. In December 2014, Prime Minister Hage Geingob, the candidate of the ruling SWAPO party, won the presidential elections, taking 87% of the vote. His predecessor, President Hifikepunye Pohamba, also of SWAPO, had served the maximum two terms allowed by the constitution. In December 2019, President Hage Geingob was re-elected for a second term, taking 56.3% of the vote. Geography At , Namibia is the world's thirty-fourth largest country (after Venezuela). It lies mostly between latitudes 17° and 29°S (a small area is north of 17°), and longitudes 11° and 26°E. Situated between the Namib and the Kalahari deserts, Namibia has the least rainfall of any country in sub-Saharan Africa. The Namibian landscape consists generally of five geographical areas, each with characteristic abiotic conditions and vegetation, with some variation within and overlap between them: the Central Plateau, the Namib, the Great Escarpment, the Bushveld, and the Kalahari Desert. The Central Plateau runs from north to south, bordered by the Skeleton Coast to the northwest, the Namib Desert and its coastal plains to the southwest, the Orange River to the south, and the Kalahari Desert to the east. The Central Plateau is home to the highest point in Namibia, Königstein, at an elevation of . The Namib is a broad expanse of hyper-arid gravel plains and dunes that stretches along Namibia's entire coastline. It varies between in width.
Areas within the Namib include the Skeleton Coast and the Kaokoveld in the north and the extensive Namib Sand Sea along the central coast. The Great Escarpment swiftly rises to over . Average temperatures and temperature ranges increase further inland from the cold Atlantic waters, while the lingering coastal fogs slowly diminish. Although the area is rocky with poorly developed soils, it is significantly more productive than the Namib Desert. As summer winds are forced over the Escarpment, moisture is extracted as precipitation. The Bushveld is found in north-eastern Namibia along the Angolan border and in the Caprivi Strip. The area receives a significantly greater amount of precipitation than the rest of the country, averaging around per year. The area is generally flat and the soils sandy, limiting their ability to retain water and support agriculture. The Kalahari Desert, an arid region that extends into South Africa and Botswana, is one of Namibia's well-known geographical features. The Kalahari, while popularly known as a desert, has a variety of localised environments, including some verdant and technically non-desert areas. The Succulent Karoo is home to over 5,000 species of plants, nearly half of them endemic; approximately 10 percent of the world's succulents are found in the Karoo. The reason behind this high productivity and endemism may be the relatively stable nature of precipitation. Namibia's Coastal Desert is one of the oldest deserts in the world. Its sand dunes, created by the strong onshore winds, are the highest in the world. Because of the location of the shoreline, at the point where the Atlantic's cold water reaches Africa's hot climate, often extremely dense fog forms along the coast. Near the coast there are areas where the dune-hummocks are vegetated. Namibia has rich coastal and marine resources that remain largely unexplored. 
Climate Namibia extends from 17°S to 25°S latitude: climatically the range of the sub-Tropical High Pressure Belt. Its overall climate description is arid, descending from the Sub-Humid [mean rain above ] through Semi-Arid [between ] (embracing most of the waterless Kalahari) and Arid [from ] (these three regions are inland from the western escarpment) to the Hyper-Arid coastal plain [less than ]. Temperature maxima are limited by the overall elevation of the entire region: only in the far south, at Warmbad for instance, are maxima above recorded. Typically the sub-Tropical High Pressure Belt, with frequent clear skies, provides more than 300 days of sunshine per year. The country is situated at the southern edge of the tropics; the Tropic of Capricorn cuts it about in half. The winter (June – August) is generally dry. Both rainy seasons occur in summer: the small rainy season between September and November, the big one between February and April. Humidity is low, and average rainfall varies from almost zero in the coastal desert to more than in the Caprivi Strip. Rainfall is highly variable, and droughts are common. In the summer of 2006/07 the recorded rainfall was far below the annual average. In May 2019, Namibia declared a state of emergency in response to the drought, and extended it by an additional six months in October 2019. Weather and climate in the coastal area are dominated by the cold, north-flowing Benguela Current of the Atlantic Ocean, which accounts for very low precipitation ( per year or less), frequent dense fog, and overall lower temperatures than in the rest of the country. In winter, occasionally a condition known as (German for "mountain breeze") or (Afrikaans for "east weather") occurs, a hot dry wind blowing from the inland to the coast. As the area behind the coast is a desert, these winds can develop into sand storms, leaving sand deposits in the Atlantic Ocean that are visible on satellite images.
The Central Plateau and Kalahari areas have wide diurnal temperature ranges of up to 30 °C (54 °F). Efundja, the annual seasonal flooding of the northern parts of the country, often causes not only damage to infrastructure but also loss of life. The rains that cause these floods originate in Angola, flow into Namibia's Cuvelai-Etosha Basin, and fill the oshanas (Oshiwambo: flood plains) there. The worst floods occurred in March 2011 and displaced 21,000 people. Water sources Namibia is the driest country in sub-Saharan Africa and depends largely on groundwater. With an average rainfall of about per annum, the highest rainfall occurs in the Caprivi in the northeast (about per annum) and decreases in a westerly and southwesterly direction to as little as and less per annum at the coast. The only perennial rivers are found on the national borders with South Africa, Angola, Zambia, and the short border with Botswana in the Caprivi. In the interior of the country, surface water is available only in the summer months when rivers are in flood after exceptional rainfalls. Otherwise, surface water is restricted to a few large storage dams retaining and damming up these seasonal floods and their run-off. Where people do not live near perennial rivers or make use of the storage dams, they are dependent on groundwater. Even isolated communities and those economic activities located far from good surface water sources, such as mining, agriculture, and tourism, can be supplied from groundwater over nearly 80% of the country. More than 100,000 boreholes have been drilled in Namibia over the past century. One third of these boreholes have been drilled dry. An aquifer called Ohangwena II, on both sides of the Angola-Namibia border, was discovered in 2012. It has been estimated to be capable of supplying a population of 800,000 people in the North for 400 years, at the current (2018) rate of consumption. Experts estimate that Namibia has of underground water.
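A note on the diurnal figures: a temperature range is a difference, not an absolute temperature, so converting it to Fahrenheit uses only the 9/5 scale factor (the 32-degree offset cancels out). A one-line sketch:

```python
def delta_c_to_f(delta_c: float) -> float:
    """Convert a temperature *difference* from Celsius to Fahrenheit.
    Unlike an absolute temperature, a difference needs no 32-degree
    offset, only the 9/5 scale factor."""
    return delta_c * 9.0 / 5.0

print(delta_c_to_f(30))  # 54.0 — a 30 °C diurnal range is a 54 °F range
```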
Communal Wildlife Conservancies Namibia is one of few countries in the world to specifically address conservation and protection of natural resources in its constitution. Article 95 states, "The State shall actively promote and maintain the welfare of the people by adopting international policies aimed at the following: maintenance of ecosystems, essential ecological processes, and biological diversity of Namibia, and utilisation of living natural resources on a sustainable basis for the benefit of all Namibians, both present and future." In 1993, Namibia's newly formed government received funding from the United States Agency for International Development (USAID) through its Living in a Finite Environment (LIFE) Project. The Ministry of Environment and Tourism, with financial support from organisations such as USAID, Endangered Wildlife Trust, WWF, and Canadian Ambassador's Fund, together form a Community-Based Natural Resource Management (CBNRM) support structure. The project's main goal is to promote sustainable natural resource management by giving local communities rights to wildlife management and tourism. Government and politics Namibia is a unitary semi-presidential representative democratic republic. The President of Namibia is elected to a five-year term and is both the head of state and the head of government. All members of the government are individually and collectively responsible to the legislature. The Constitution of Namibia outlines the following as the organs of the country's government: Executive: executive power is exercised by the President and the Government. Legislature: Namibia has a bicameral Parliament with the National Assembly as lower house, and the National Council as the upper house. Judiciary: Namibia has a system of courts that interpret and apply the law in the name of the state. While the constitution envisaged a multi-party system for Namibia's government, the SWAPO party has been dominant since independence in 1990. 
Foreign relations Namibia has a largely independent foreign policy, with persisting affiliations with states that aided the independence struggle, including Cuba. With a small army and a fragile economy, the Namibian government's principal foreign policy concern is developing strengthened ties within the Southern African region. A dynamic member of the Southern African Development Community, Namibia is a vocal advocate for greater regional integration. It became the 160th member of the UN on 23 April 1990. On its independence it became the 50th member of the Commonwealth of Nations. Military In early 2020, the Global Firepower Index (GFP) ranked Namibia's military as one of the weakest in the world, at 126th out of 137 countries and 28th among 34 African countries. Despite this, government spending for the Ministry of Defence stood at N$5,885 million (a 1.2% decrease from the previous financial year). With close to 6 billion Namibian dollars ($411 million USD in 2021), the Ministry of Defence receives the fourth highest amount of money from Government per ministry. Namibia does not have any enemies in the region, though it has been involved in various disputes regarding borders and construction plans. The Namibian constitution defines the role of the military as "defending the territory and national interests." Namibia formed the Namibian Defence Force (NDF), comprising former enemies in a 23-year bush war: the People's Liberation Army of Namibia (PLAN) and South West African Territorial Force (SWATF). The British formulated the plan for integrating these forces and began training the NDF, which consists of a small headquarters and five battalions. The United Nations Transitional Assistance Group (UNTAG)'s Kenyan infantry battalion remained in Namibia for three months after independence to help train the NDF and to stabilise the north.
According to the Namibian Defence Ministry, enlistments of both men and women will number no more than 7,500. The chief of the Namibian Defence Force is Air Vice Marshal Martin Kambulu Pinehas (with effect from 1 April 2020). In 2017, Namibia signed the UN Treaty on the Prohibition of Nuclear Weapons. Administrative divisions Namibia is divided into 14 regions, which are subdivided into 121 constituencies. The administrative division of Namibia is tabled by Delimitation Commissions and accepted or declined by the National Assembly. Since the state's foundation, four Delimitation Commissions have delivered their work, the last one in 2013 under the chairmanship of Judge Alfred Siboleka. Regional councillors are directly elected through secret ballots (regional elections) by the inhabitants of their constituencies. Local authorities in Namibia can be in the form of municipalities (either Part 1 or Part 2 municipalities), town councils or villages. Human rights Homosexual acts are illegal in Namibia, and discrimination and intolerance against LGBT people are still widespread, although the ban on gay sex is not enforced. Some Namibian government officials and high-profile figures, such as Namibia's Ombudsman John Walters and First Lady Monica Geingos, have called for sodomy and homosexuality to be decriminalised and are in favour of LGBT rights. In November 2018, it was reported that 32% of women aged 15–49 have experienced violence and domestic abuse from their spouses or partners, and that 29.5% of men believe physical abuse towards their wife or partner is acceptable. The Namibian constitution guarantees the rights, freedoms and equal treatment of women, and SWAPO, the ruling party in Namibia, has adopted a "zebra system", which ensures a fair balance of both genders and equal representation of women in the Namibian government.
Namibia is considered one of the most free and democratic countries in Africa, with a government that maintains and protects basic human rights and freedoms. Economy Namibia's economy is tied closely to South Africa's due to their shared history. The largest economic sectors are mining (10.4% of the gross domestic product in 2009), agriculture (5.0%), manufacturing (13.5%), and tourism. Namibia has a highly developed banking sector with modern infrastructure, such as online banking and cellphone banking. The Bank of Namibia (BoN) is the central bank of Namibia, responsible for performing the functions ordinarily performed by a central bank. There are five BoN-authorised commercial banks in Namibia: Bank Windhoek, First National Bank, Nedbank, Standard Bank and Small and Medium Enterprises Bank. According to the Namibia Labour Force Survey Report 2012, conducted by the Namibia Statistics Agency, the country's unemployment rate is 27.4%. "Strict unemployment" (people actively seeking a full-time job) stood at 20.2% in 2000 and 21.9% in 2004, and spiralled to 29.4% in 2008. Under a broader definition (including people who have given up searching for employment) unemployment rose to 36.7% in 2004. This estimate considers people in the informal economy as employed. Labour and Social Welfare Minister Immanuel Ngatjizeko praised the 2008 study as "by far superior in scope and quality to any that has been available previously", but its methodology has also received criticism. In 2004 a labour act was passed to protect people from job discrimination stemming from pregnancy and HIV/AIDS status. In early 2010 the Government tender board announced that "henceforth 100 per cent of all unskilled and semi-skilled labour must be sourced, without exception, from within Namibia". In 2013, the global business and financial news provider Bloomberg named Namibia the top emerging market economy in Africa and the 13th best in the world.
Only four African countries made the Top 20 Emerging Markets list in the March 2013 issue of Bloomberg Markets magazine, and Namibia was rated ahead of Morocco (19th), South Africa (15th), and Zambia (14th). Worldwide, Namibia also fared better than Hungary, Brazil, and Mexico. Bloomberg Markets magazine ranked the top 20 based on more than a dozen criteria. The data came from Bloomberg's own financial-market statistics, IMF forecasts and the World Bank. The countries were also rated on areas of particular interest to foreign investors: the ease of doing business, the perceived level of corruption and economic freedom. To attract foreign investment, the government has made progress in reducing red tape resulting from excessive government regulation, making Namibia one of the least bureaucratic places to do business in the region. However, facilitation payments are occasionally demanded by customs due to cumbersome and costly customs procedures. Namibia is also classified as an Upper Middle Income country by the World Bank, and ranks 87th out of 185 economies in terms of ease of doing business. The cost of living in Namibia is relatively high because most goods, including cereals, need to be imported. Its capital city, Windhoek, is the 150th most expensive place in the world for expatriates to live. Taxation in Namibia includes personal income tax, which is applicable to the total taxable income of an individual. All individuals are taxed at progressive marginal rates over a series of income brackets. The value-added tax (VAT) is applicable to most commodities and services. Despite the remote nature of much of the country, Namibia has seaports, airports, highways, and railways (narrow-gauge). It seeks to become a regional transportation hub; it has an important seaport and several landlocked neighbours.
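The progressive marginal-rate scheme mentioned above means each slice of income is taxed at its own bracket's rate, not the whole amount at the top rate. A minimal sketch; the brackets and rates below are purely illustrative, not Namibia's actual schedule (which is set by statute and changes over time):

```python
# Hypothetical brackets for illustration only: (upper bound, marginal rate).
BRACKETS = [
    (50_000, 0.00),      # first N$50,000 untaxed
    (100_000, 0.18),     # next N$50,000 at 18%
    (300_000, 0.25),     # next N$200,000 at 25%
    (float("inf"), 0.37) # remainder at 37%
]

def marginal_tax(income: float) -> float:
    """Tax under progressive marginal rates: sum each bracket's rate
    applied only to the slice of income falling inside that bracket."""
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income > lower:
            tax += (min(income, upper) - lower) * rate
        lower = upper
    return tax

print(marginal_tax(150_000))  # 50k*0 + 50k*0.18 + 50k*0.25 = 21500.0
```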
The Central Plateau already serves as a transportation corridor from the more densely populated north to South Africa, the source of four-fifths of Namibia's imports. Income disparity Namibia is a country with a substantial income disparity. The data indicate that the income share currently held by the highest 10% is approximately 51.8%, illustrating the large gap between the rich and the poor. An additional figure describes the poverty gap: approximately 17.72% of the population live on US$2 or less per day. Agriculture About half of the population depends on agriculture (largely subsistence agriculture) for its livelihood, but Namibia must still import some of its food. Although per capita GDP is five times the per capita GDP of Africa's poorest countries, the majority of Namibia's people live in rural areas and have a subsistence way of life. Namibia has one of the highest rates of income inequality in the world, due in part to the fact that there is an urban economy and a more rural cashless economy. The inequality figures thus take into account people who do not actually rely on the formal economy for their survival. Although arable land accounts for less than 1% of Namibia (about 0.97%), nearly half of the population is employed in agriculture. About 4,000, mostly white, commercial farmers own almost half of Namibia's arable land. The United Kingdom offered about $180,000 in 2004 to help finance Namibia's land reform process, as Namibia plans to start expropriating land from white farmers to resettle landless black Namibians. In 2021 Germany offered €1.1bn over 30 years in reparations for the genocides of the early 20th century, but the money will go towards infrastructure, healthcare and training programmes, not land reform.
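The income-share statistic quoted above (the top 10% holding roughly 51.8% of income) is computed from an income distribution. A toy sketch, with an illustrative function name and made-up data:

```python
def top_decile_share(incomes: list[float]) -> float:
    """Share of total income held by the richest 10% of earners."""
    ordered = sorted(incomes)
    k = max(1, len(ordered) // 10)  # size of the top decile
    return sum(ordered[-k:]) / sum(ordered)

# Toy distribution: nine earners on 1 unit each, one earner on 10 units.
print(round(top_decile_share([1] * 9 + [10]), 3))  # 0.526 — top 10% hold ~53%
```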
An agreement has been reached on the privatisation of several more enterprises in coming years, with hopes that this will stimulate much-needed foreign investment, but reinvestment of environmentally derived capital has hobbled Namibian per capita income. One of the fastest growing areas of economic development in Namibia is the growth of wildlife conservancies. These are particularly important to the rural, generally unemployed, population. Mining and electricity Providing 25% of Namibia's revenue, mining is the single most important contributor to the economy. Namibia is the fourth largest exporter of non-fuel minerals in Africa and the world's fourth largest producer of uranium. There has been significant investment in uranium mining, and Namibia was expected to become the largest exporter of uranium by 2015. Rich alluvial diamond deposits make Namibia a primary source for gem-quality diamonds. While Namibia is known predominantly for its gem diamond and uranium deposits, a number of other minerals are extracted industrially, such as lead, tungsten, gold, tin, fluorspar, manganese, marble, copper and zinc. There are offshore gas deposits in the Atlantic Ocean that are planned to be extracted in the future. According to "The Diamond Investigation", a 1978 book about the global diamond market, De Beers, the largest diamond company, bought most of the Namibian diamonds and would continue to do so, because "whatever government eventually comes to power they will need this revenue to survive". Domestic supply voltage is 220 V AC. Electricity is generated mainly by thermal and hydroelectric power plants. Non-conventional methods of electricity generation also play some role. Encouraged by the rich uranium deposits, the Namibian government planned to erect its first nuclear power station by 2018; uranium enrichment is also envisaged to take place locally.
Diamonds Although much of the world's diamond supply comes from what have been called African blood diamonds, Namibia has managed to develop a diamond mining industry largely free of the kinds of conflict, extortion, and murder that have plagued many other African nations with diamond mines. This has been attributed to political dynamics, economic institutions, grievances, political geography, and the effects of neighbourhoods, and is the result of a joint agreement between the government and De Beers that has led to a taxable base, strengthening state institutions. Tourism Tourism is a major contributor (14.5%) to Namibia's GDP, creating tens of thousands of jobs (18.2% of all employment) directly or indirectly and servicing over a million tourists per year. The country is a prime destination in Africa and is known for ecotourism, which features Namibia's extensive wildlife. There are many lodges and reserves to accommodate ecotourists. Sport and trophy hunting is also a large and growing component of the Namibian economy, accounting for 14% of total tourism in the year 2000, or 19.6 million U.S. dollars, with Namibia boasting numerous species sought after by international sport hunters. In addition, extreme sports such as sandboarding, skydiving and 4x4ing have become popular, and many cities have companies that provide tours. The most visited places include the capital city of Windhoek, Caprivi Strip, Fish River Canyon, Sossusvlei, the Skeleton Coast Park, Sesriem, Etosha Pan and the coastal towns of Swakopmund, Walvis Bay and Lüderitz. Windhoek plays a very important role in Namibia's tourism due to its central location and close proximity to Hosea Kutako International Airport. According to The Namibia Tourism Exit Survey, which was produced by the Millennium Challenge Corporation for the Namibian Directorate of Tourism, 56% of all tourists visiting Namibia in 2012–13 visited Windhoek. 
Many of Namibia's tourism-related parastatals and governing bodies, such as Namibia Wildlife Resorts and the Namibia Tourism Board, as well as Namibia's tourism-related trade associations, such as the Hospitality Association of Namibia, are headquartered in Windhoek. There are also a number of notable hotels in Windhoek, such as Windhoek Country Club Resort, and some international hotel chains, such as Hilton Hotels and Resorts. Namibia's primary tourism-related governing body, the Namibia Tourism Board (NTB), was established by an Act of Parliament: the Namibia Tourism Board Act, 2000 (Act 21 of 2000). Its primary objectives are to regulate the tourism industry and to market Namibia as a tourist destination. There are also a number of trade associations that represent the tourism sector in Namibia, such as the Federation of Namibia Tourism Associations (the umbrella body for all tourism associations in Namibia), the Hospitality Association of Namibia, the Association of Namibian Travel Agents, the Car Rental Association of Namibia and the Tour and Safari Association of Namibia. Water supply and sanitation Namibia is the only country in Sub-Saharan Africa to provide water through municipal departments. The only bulk water supplier in Namibia is NamWater, which sells it to the respective municipalities, which in turn deliver it through their reticulation networks. In rural areas, the Directorate of Rural Water Supply in the Ministry of Agriculture, Water and Forestry is in charge of drinking water supply. The UN evaluated in 2011 that Namibia has improved its water access network significantly since independence in 1990. A large part of the population cannot, however, make use of these resources due to the prohibitively high cost of consumption and the long distance between residences and water points in rural areas. As a result, many Namibians prefer traditional wells over water points that are far away.
Compared to the efforts made to improve access to safe water, Namibia is lagging behind in the provision of adequate sanitation. This includes 298 schools that have no toilet facilities. Over 50% of child deaths are related to lack of water, sanitation, or hygiene; 23% are due to diarrhoea alone. The UN has identified a "sanitation crisis" in the country. Apart from residences for upper and middle class households, sanitation is insufficient in most residential areas. Private flush toilets are too expensive for virtually all residents in townships due to their water consumption and installation cost. As a result, access to improved sanitation has not increased much since independence: in Namibia's rural areas 13% of the population had more than basic sanitation, up from 8% in 1990. Many of Namibia's inhabitants have to resort to "flying toilets", plastic bags to defecate into, which after use are flung into the bush. The use of open areas close to residential land for urination and defecation is very common and has been identified as a major health hazard. Demographics Namibia has the second-lowest population density of any sovereign country, after Mongolia. In 2017 there were on average 3.08 people per km2. The total fertility rate in 2015 was 3.47 children per woman according to the UN. Ethnic groups The majority of the Namibian population is of Bantu-speaking origin—mostly of the Ovambo ethnicity, which forms about half of the population—residing mainly in the north of the country, although many are now resident in towns throughout Namibia. Other ethnic groups are the Herero and Himba people, who speak a similar language, and the Damara, who, like the Nama, speak Khoekhoe. In addition to the Bantu majority, there are large groups of Khoisan (such as Nama and San), who are descendants of the original inhabitants of Southern Africa. The country also contains some descendants of refugees from Angola. 
A notable area of economic development in Namibia is the growth of wildlife conservancies. These are particularly important to the rural, generally unemployed, population. Mining and electricity Providing 25% of Namibia's revenue, mining is the single most important contributor to the economy. Namibia is the fourth largest exporter of non-fuel minerals in Africa and the world's fourth largest producer of uranium.
There has been significant investment in uranium mining, and Namibia is set to become the largest exporter of uranium by 2015. Rich alluvial diamond deposits make Namibia a primary source of gem-quality diamonds. While Namibia is known predominantly for its gem diamond and uranium deposits, a number of other minerals, such as lead, tungsten, gold, tin, fluorspar, manganese, marble, copper and zinc, are extracted industrially. There are offshore gas deposits in the Atlantic Ocean that are planned to be extracted in the future. According to "The Diamond Investigation", a book about the global diamond market, De Beers, the largest diamond company, bought most of the Namibian diamonds from 1978 onward and would continue to do so, because "whatever government eventually comes to power they will need this revenue to survive". Domestic supply voltage is 220 V AC. Electricity is generated mainly by thermal and hydroelectric power plants. Non-conventional methods of electricity generation also play some role. Encouraged by its rich uranium deposits, the Namibian government plans to erect its first nuclear power station by 2018; uranium enrichment is also envisaged to take place locally. Diamonds Although much of the world's diamond supply comes from what have been called African blood diamonds, Namibia has managed to develop a diamond mining industry largely free of the kinds of conflict, extortion, and murder that have plagued many other African nations with diamond mines. This has been attributed to political dynamics, economic institutions, grievances, political geography, and neighbourhood effects, and is the result of a joint agreement between the government and De Beers that has led to a taxable base, strengthening state institutions. Tourism Tourism is a major contributor (14.5%) to Namibia's GDP, creating tens of thousands of jobs (18.2% of all employment) directly or indirectly and servicing over a million tourists per year.
The country is a prime destination in Africa and is known for ecotourism, which features Namibia's extensive wildlife. There are many lodges and reserves to accommodate ecotourists. Sport and trophy hunting is also a large and growing component of the Namibian economy, accounting for 14% of total tourism in the year 2000, or 19.6 million U.S. dollars, with Namibia boasting numerous species sought after by international sport hunters. In addition, extreme sports such as sandboarding, skydiving and 4x4ing have become popular, and many cities have companies that provide tours. The most visited places include the capital city of Windhoek, the Caprivi Strip, Fish River Canyon, Sossusvlei, the Skeleton Coast Park, Sesriem, Etosha Pan and the coastal towns of Swakopmund, Walvis Bay and Lüderitz. Windhoek plays a very important role in Namibia's tourism due to its central location and proximity to Hosea Kutako International Airport. According to the Namibia Tourism Exit Survey, which was produced by the Millennium Challenge Corporation for the Namibian Directorate of Tourism, 56% of all tourists visiting Namibia in 2012–13 visited Windhoek. Many of Namibia's tourism-related parastatals and governing bodies, such as Namibia Wildlife Resorts and the Namibia Tourism Board, as well as Namibia's tourism-related trade associations, such as the Hospitality Association of Namibia, are headquartered in Windhoek. There are also a number of notable hotels in Windhoek, such as the Windhoek Country Club Resort, and some international hotel chains, such as Hilton Hotels and Resorts. Namibia's primary tourism-related governing body, the Namibia Tourism Board (NTB), was established by an Act of Parliament: the Namibia Tourism Board Act, 2000 (Act 21 of 2000). Its primary objectives are to regulate the tourism industry and to market Namibia as a tourist destination.
There are also a number of trade associations that represent the tourism sector in Namibia, such as the Federation of Namibia Tourism Associations (the umbrella body for all tourism associations in Namibia), the Hospitality Association of Namibia, the Association of Namibian Travel Agents, the Car Rental Association of Namibia and the Tour and Safari Association of Namibia. Water supply and sanitation Namibia is the only country in Sub-Saharan Africa to provide water through municipal departments. The only bulk water supplier in Namibia is NamWater, which sells water to the respective municipalities, which in turn deliver it through their reticulation networks. In rural areas, the Directorate of Rural Water Supply in the Ministry of Agriculture, Water and Forestry is in charge of drinking water supply. The UN evaluated in 2011 that Namibia had improved its water access network significantly since independence in 1990. A large part of the population cannot, however, make use of these resources because of the prohibitively high cost of consumption and the long distance between residences and water points in rural areas. As a result, many Namibians prefer traditional wells to the water points far away. Compared to the efforts made to improve access to safe water, Namibia is lagging behind in the provision of adequate sanitation; among other shortfalls, 298 schools have no toilet facilities. Over 50% of child deaths are related to lack of water, sanitation, or hygiene; 23% are due to diarrhoea alone. The UN has identified a "sanitation crisis" in the country. Apart from the residences of upper- and middle-class households, sanitation is insufficient in most residential areas. Private flush toilets are too expensive for virtually all residents in townships due to their water consumption and installation cost.
As a result, access to improved sanitation has not increased much since independence: in Namibia's rural areas 13% of the population had more than basic sanitation, up from 8% in 1990. Many of Namibia's inhabitants have to resort to "flying toilets": plastic bags to defecate into, which after use are flung into the bush. The use of open areas close to residential land for urination and defecation is very common and has been identified as a major health hazard. Demographics Namibia has the second-lowest population density of any sovereign country, after Mongolia. In 2017 there were on average 3.08 people per km². The total fertility rate in 2015 was 3.47 children per woman, according to the UN. Ethnic groups The majority of the Namibian population is of Bantu-speaking origin (mostly of the Ovambo ethnicity, which forms about half of the population), residing mainly in the north of the country, although many are now resident in towns throughout Namibia. Other ethnic groups are the Herero and Himba people, who speak a similar language, and the Damara, who, like the Nama, speak Khoekhoe. In addition to the Bantu majority, there are large groups of Khoisan (such as Nama and San), who are descendants of the original inhabitants of Southern Africa. The country also contains some descendants of refugees from Angola. There are also two smaller groups of people with mixed racial origins, called "Coloureds" and "Basters", who together make up 8.0% of the population (with the Coloureds outnumbering the Basters two to one). There is a substantial Chinese minority in Namibia; it stood at 40,000 in 2006. Whites (mainly of Afrikaner, German, British and Portuguese origin) make up between 4.0% and 7.0% of the population. Although their proportion of the population decreased after independence due to emigration and lower birth rates, they still form the second-largest population of European ancestry, both in terms of percentage and actual numbers, in Sub-Saharan Africa (after South Africa).
The majority of Namibian whites and nearly all those of mixed race speak Afrikaans and share similar origins, culture, and religion with the white and coloured populations of South Africa. A large minority of whites (around 30,000) trace their family origins back to the German settlers who colonised Namibia prior to the British confiscation of German lands after World War I, and they maintain German cultural and educational institutions. Nearly all Portuguese settlers came to the country from the former Portuguese colony of Angola. The 1960 census reported 526,004 persons in what was then South West Africa, including 73,464 whites (14%). Censuses Namibia conducts a census every ten years. After independence the first Population and Housing Census was carried out in 1991; further rounds followed in 2001 and 2011. The data collection method is to count every person resident in Namibia on the census reference night, wherever they happen to be. This is called the de facto method. For enumeration purposes the country is demarcated into 4,042 enumeration areas. These areas do not cross constituency boundaries, so that the data are also reliable for election purposes. The 2011 Population and Housing Census counted 2,113,077 inhabitants. Between 2001 and 2011 the annual population growth was 1.4%, down from 2.6% in the previous ten-year period. Urban settlements Namibia has 13 cities, governed by municipalities, and 26 towns, governed by town councils. The capital Windhoek is by far the largest urban settlement in Namibia. Religion The Christian community makes up 80%–90% of the population of Namibia, with at least 75% being Protestant, of which at least 50% are Lutheran. Lutherans are the largest religious group, a legacy of German and Finnish missionary work during the country's colonial times. 10%–20% of the population hold indigenous beliefs. Missionary activities during the second half of the 19th century led many Namibians to convert to Christianity.
Today most Christians are Lutheran, but there are also Roman Catholics, Methodists, Anglicans, African Methodist Episcopalians, Dutch Reformed adherents, Latter-day Saints and Jehovah's Witnesses. Islam in Namibia is practised by about 9,000 people, many of them Nama. Namibia is home to a small Jewish community of about 100 people. Languages Until 1990, English, German, and Afrikaans were official languages. Long before Namibia's independence from South Africa, SWAPO was of the opinion that the country should become officially monolingual, choosing this approach in contrast to that of its neighbour South Africa (which granted all 11 of its major languages official status), which it saw as "a deliberate policy of ethnolinguistic fragmentation." Consequently, SWAPO instituted English as Namibia's sole official language, though only about 3% of the population speaks it as a home language. Its implementation is focused on the civil service, education and the broadcasting system, especially the state broadcaster NBC. Some other languages have received semi-official recognition by being allowed as the medium of instruction in primary schools. Private schools are expected to follow the same policy as state schools, and "English language" is a compulsory subject. Some critics argue that, as in other postcolonial African societies, the push for monolingual instruction and policy has resulted in a high rate of school drop-outs and in individuals whose academic competence in any language is low. According to the 2011 census, the most common languages are Oshiwambo (the most spoken language in 49% of households), Khoekhoegowab (11.3%), Afrikaans (10.4%), RuKwangali (9%), and Otjiherero (9%). The most widely understood national language is Afrikaans, the country's lingua franca. Both Afrikaans and English are used primarily as second languages reserved for public communication.
A complete list of languages according to the 2011 census: 48.9% Oshiwambo, 11.3% Khoekhoegowab, 10.4% Afrikaans, 8.6% Otjiherero, 8.5% RuKwangali, 4.8% siLozi, 3.4% English, 1.2% other African languages, 0.9% German, 0.8% San, 0.7% other European languages, 0.3% Setswana, and 0.1% Asian languages. Most of the white population speaks either German or Afrikaans. Even today, years after the end of the German colonial era, German plays a role as a commercial language. Afrikaans is spoken by 60% of the white community, German by 32%, English by 7% and Portuguese by 4–5%. Geographical proximity to Portuguese-speaking Angola explains the relatively high number of Portuguese speakers; in 2011 these were estimated at 100,000, or 4–5% of the total population. Health Life expectancy at birth was estimated at 64 years in 2017, among the lowest in the world. Namibia launched a National Health Extension Programme in 2012, deploying 1,800 (as of 2015) of a total ceiling of 4,800 health extension workers, trained for six months in community health activities including first aid, health promotion for disease prevention, nutritional assessment and counseling, water sanitation and hygiene practices, HIV testing and community-based antiretroviral treatment. Namibia also faces a non-communicable disease burden. The Demographic and Health Survey (2013) summarises findings on elevated blood pressure, hypertension, diabetes, and obesity: among eligible respondents aged 35–64, more than 4 in 10 women (44 percent) and men (45 percent) have elevated blood pressure or are currently taking medicine to lower their blood pressure. Forty-nine percent of women and 61 percent of men are not aware that they have elevated blood pressure. Forty-three percent of women and 34 percent of men with hypertension are taking medication for their condition. Only 29 percent of women and 20 percent of men with hypertension are taking medication and have their blood pressure under control.
Six percent of women and 7 percent of men are diabetic; that is, they have elevated fasting plasma glucose values or report that they are taking diabetes medication. An additional 7 percent of women and 6 percent of men are prediabetic. Sixty-seven percent of women and 74 percent of men with diabetes are taking medication to lower their blood glucose. Women and men with a higher-than-normal body mass index (25.0 or higher) are more likely to have elevated blood pressure and elevated fasting blood glucose. The HIV epidemic remains a public health issue in Namibia despite significant achievements made by the Ministry of Health and Social Services to expand HIV treatment services. In 2001, there were an estimated 210,000 people living with HIV/AIDS, and the estimated death toll in 2003 was 16,000. According to the 2011 UNAIDS Report, the epidemic in Namibia "appears to be leveling off." As the HIV/AIDS epidemic has reduced the working-age population, the number of orphans has increased.
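As a rough consistency check (not part of the source text), the headline population figures quoted in this article fit together arithmetically. A minimal sketch in Python, using only numbers that appear in the article (the total land area, the 2011 census count, the 1.4% annual growth rate between the 2001 and 2011 censuses, and the 2017 density figure); the derived values are approximations, not official statistics:

```python
# Cross-checking demographic figures quoted in this article.
# All inputs appear in the text; derived values are approximations.

LAND_AREA_KM2 = 824_292          # total area (includes 1,002 km² of water)
CENSUS_2011 = 2_113_077          # 2011 Population and Housing Census count
ANNUAL_GROWTH_2001_2011 = 0.014  # 1.4% per year between the 2001 and 2011 censuses
DENSITY_2017 = 3.08              # people per km², 2017

# Density implied by the 2011 census count:
density_2011 = CENSUS_2011 / LAND_AREA_KM2
print(f"2011 density: {density_2011:.2f} people/km2")   # about 2.56

# Population implied by the 2017 density figure:
pop_2017 = DENSITY_2017 * LAND_AREA_KM2
print(f"implied 2017 population: {pop_2017:,.0f}")      # about 2.54 million

# Back-casting the 2001 population from the 2011 count and 1.4% annual growth:
pop_2001 = CENSUS_2011 / (1 + ANNUAL_GROWTH_2001_2011) ** 10
print(f"implied 2001 population: {pop_2001:,.0f}")      # roughly 1.84 million
```

The implied 2017 population of roughly 2.54 million agrees with the CIA World Factbook estimate quoted later (2,533,224 as of July 2018), which suggests the density and census figures come from compatible series.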
Namibia has rich coastal and marine resources that remain largely unexplored. Weather and climate Namibia has more than 300 days of sunshine per year. It is situated at the southern edge of the tropics; the Tropic of Capricorn cuts the country about in half. The winter (June–August) is generally dry. Both rainy seasons occur in summer: the small rainy season between September and November, and the big one between February and April. Humidity is low, and average rainfall varies from almost zero in the coastal desert to more than 600 mm in the Caprivi Strip. Rainfall is, however, highly variable, and droughts are common. A bad rainy season occurred in summer 2006/07, and very low rainfall was recorded in 2019. Due to the dry winters, snowfall is very rare and prompts media coverage whenever it occurs; snow was reported at Spreetshoogte Pass in the Namib-Naukluft Park in June 2011. Weather and climate in the coastal area are dominated by the cold, north-flowing Benguela Current of the Atlantic Ocean, which accounts for very low precipitation (50 mm per year or less), frequent dense fog, and overall lower temperatures than in the rest of the country. In winter a condition known as Berg wind or Oosweer (Afrikaans: east weather) occasionally occurs, a hot dry wind blowing from the inland to the coast. As the area behind the coast is a desert, these winds can develop into sand storms, with sand deposits in the Atlantic Ocean visible on satellite images. The Central Plateau and Kalahari areas have wide diurnal temperature ranges of up to 30 °C. Water sources Namibia is the driest country in sub-Saharan Africa and depends largely on groundwater. With an average rainfall of about per annum, the highest rainfall occurs in the Caprivi in the northeast (about per annum) and decreases in a westerly and southwesterly direction to as little as and less per annum at the coast.
The only perennial rivers are found on the national borders with South Africa, Angola, Zambia, and the short border with Botswana in the Caprivi. In the interior of the country, surface water is available only in the summer months, when rivers are in flood after exceptional rainfalls. Otherwise, surface water is restricted to a few large storage dams retaining and damming up these seasonal floods and their runoff. Where people do not live near perennial rivers or make use of the storage dams, they are dependent on groundwater. Even isolated communities and economic activities located far from good surface water sources, such as mining, agriculture, and tourism, can be supplied from groundwater over nearly 80% of the country. The longest river in Namibia is the Fish River with a length of . More than 120,000 boreholes have been drilled in Namibia over the past century; one third of these boreholes have been drilled dry. An aquifer called "Ohangwena II", located on both sides of the Angola–Namibia border, was discovered in 2012. It has been estimated to be capable of supplying the 800,000 people in the North for 400 years at the current (2018) rate of consumption. Experts estimate that Namibia has of underground water. Efundja, the annual flooding of the northern parts of the country, often causes not only damage to infrastructure but also loss of life. The rains that cause these floods originate in Angola, flow into Namibia's Cuvelai basin, and fill the Oshanas (Oshiwambo: flood plains) there. The worst floods occurred in March 2011 and displaced 21,000 people. Urbanization The capital and largest city, Windhoek, is in the centre of the country. It is home to the country's Central Administrative Region, Windhoek Hosea Kutako International Airport and the country's railhead.
Other important towns are:
Arandis - uranium mine
Walvis Bay - sea port, international airport, railhead
Oshakati - main business centre in the North, railhead
Otjiwarongo - main business centre in the Central-North, rail junction
Lüderitz - sea port, railhead
Gobabis - farming centre
Keetmanshoop - railhead
Tsumeb - mining
Swakopmund - tourism (former German colonial town)
Rundu, Katima Mulilo, Okahandja
Statistics
Location: Southern Africa, bordering the South Atlantic Ocean, between Angola and South Africa
Geographic coordinates:
Area: total: 824,292 km² (land: 823,290 km²; water: 1,002 km²)
Land boundaries: total: 4,220 km; border countries: Angola 1,427 km, Botswana 1,544 km, South Africa 1,005 km, Zambia 244 km
Coastline: 1,572 km
Maritime claims: territorial sea: ; contiguous zone: ; exclusive economic zone:
Terrain: mostly high plateau; Namib Desert along coast; Kalahari Desert in east. In the north, near the border with Angola, there is a flat area that has been designated by the World Wildlife Fund as part of the Angolan mopane woodlands ecoregion.
Elevation extremes: lowest point: Atlantic Ocean 0 m; mean elevation: 1,414 m; highest point: Königstein 2,573 m
Natural resources: diamonds, copper, uranium, gold, silver, lead, tin, lithium, cadmium, tungsten, zinc, salt, hydropower, fish; note: suspected deposits of oil, coal, and iron ore
Land use: agricultural land: 47.2% (2018); arable land: 1% (2018); permanent crops: 0% (2018); permanent pasture: 46.2% (2018); forest: 8.8% (2018); other: 44% (2018)
Irrigated land: 80 km² (2012), 75.73 km² (2003), 70 km² (1998 est.), 60 km² (1993 est.)
Total renewable water resources: 17.72 km³ (2011)
Natural hazards: prolonged periods of drought
Environment - current issues: depletion and degradation of water and aquatic resources; desertification; land degradation; loss of biodiversity and biotic resources; wildlife poaching
Environment - international agreements: party to: Antarctic-Marine Living Resources, Biodiversity, Climate Change, Climate Change-Kyoto
Coastal Desert Namibia's Coastal Desert is one of the oldest deserts in the world. Its sand dunes, created by the strong onshore winds, are the highest in the world. The Namib Desert and the Namib-Naukluft National Park are located here. The Namibian coastal deserts are the richest source of diamonds on earth, making Namibia the world's largest producer of diamonds. The desert is divided into the northern Skeleton Coast and the southern Diamond Coast. Because of the location of the shoreline, at the point where the Atlantic's cold water reaches Africa, there is often extremely dense fog. Sandy beach comprises 54% of the shoreline, and mixed sand and rock add another 28%; only 16% of the total length is rocky shoreline. The coastal plains are dune fields, gravel plains covered with lichen, and some scattered salt pans. Near the coast there are areas where the dunes are vegetated with hammocks.
Other demographic statistics Modern education and medical care have been extended in varying degrees to most rural areas in recent years. The literacy rate of Africans is generally low except in areas where missionary and government education efforts have been concentrated, such as Ovamboland. The Africans speak various indigenous languages. Demographic statistics according to the World Population Review in 2019:
One birth every 7 minutes
One death every 29 minutes
One net migrant every 1440 minutes
Net gain of one person every 10 minutes
The following demographic statistics are from the CIA World Factbook unless otherwise indicated.
Population: 2,533,224 (July 2018 est.)
Age structure:
0-14 years: 36.54% (male 467,392 /female 458,190)
15-24 years: 20.34% (male 257,190 /female 257,984)
25-54 years: 34.74% (male 421,849 /female 458,118)
55-64 years: 4.46% (male 50,459 /female 62,478)
65 years and over: 3.93% (male 42,381 /female 57,183) (2018 est.)
Birth rate: 25.33 births/1,000 population (2021 est.); country comparison to the world: 45th
Death rate: 7.07 deaths/1,000 population (2021 est.); country comparison to the world: 117th
Total fertility rate: 3.03 children born/woman (2021 est.)
Country comparison to the world: 48th
Median age: total: 21.8 years (country comparison to the world: 183rd); male: 21.1 years; female: 22.6 years (2021 est.)
Population growth rate: 1.83% (2020 est.); country comparison to the world: 45th
Mother's mean age at first birth: 21.5 years (2013 est.); note: median age at first birth among women 25-29
Contraceptive prevalence rate: 56.1% (2013)
Net migration rate: 0 migrant(s)/1,000 population (2017 est.); country comparison to the world: 94th
Dependency ratios: total dependency ratio: 68.1 (2015 est.); youth dependency ratio: 62.2 (2015 est.); elderly dependency ratio: 5.8 (2015 est.); potential support ratio: 17.1 (2015 est.)
Urbanization: urban population: 50% of total population (2018); rate of urbanization: 4.2% annual rate of change (2015-20 est.)
Sex ratio: at birth: 1.03 male(s)/female; under 15 years: 1.02 male(s)/female; 15–64 years: 0.99 male(s)/female; 65 years and over: 0.75 male(s)/female; total population: 0.96 male(s)/female (2017 est.)
Life expectancy at birth: total population: 65.87 years (2021 est.); country comparison to the world: 197th; male: 63.9 years
In 2011 the total fertility rate was 3.6 children per woman, down from 4.1 in 2001. UN estimates According to UN estimates, the total population has grown from only 485,000 in 1950. The proportion of children below the age of 15 in 2010 was 36.4%, 59.9% were between 15 and 65 years of age, and 3.7% were 65 years or older. Vital statistics Registration of vital events in Namibia is not complete. The Population Department of the United Nations prepared the following estimates.
Births and deaths
Fertility and births: Total Fertility Rate (TFR) (followed by wanted fertility rate in brackets) and Crude Birth Rate (CBR)
Fertility data as of 2013 (DHS Program)
Life expectancy at birth: life expectancy from 1950 to 2015 (UN World Population Prospects)
While the ethnic-based, three-tier, South African-imposed governing authorities have been dissolved, the current government pledged, for the sake of national reconciliation, to retain civil servants employed during the colonial period. The government is still organising itself on both national and regional levels. The Constituent Assembly converted itself into the National Assembly on 16 February 1990, retaining all the members elected on a straight party ticket. President The Namibian head of state is the president, elected by popular vote every five years. Namibia's founding president is Sam Nujoma, who was in office for three terms from 21 March 1990 (Namibia's Independence Day) until 21 March 2005. Hifikepunye Pohamba was Namibia's second president, serving from 2005 to 2015. Since 2015 Hage Geingob has been president of Namibia. Separation of powers While the separation of powers is enshrined in the country's constitution, Namibia's civil society and the opposition have repeatedly criticised the overlap between the executive and the legislature. All cabinet members also sit in the National Assembly and dominate that body, not numerically but by being the superiors of its ordinary members. Executive branch The government is headed by the prime minister, who, together with their cabinet, is appointed by the president. SWAPO, the primary force behind independence, is still the country's largest party. Hage Geingob was Namibia's first prime minister. He was appointed on 21 March 1990 and served until 28 August 2002. Theo-Ben Gurirab was prime minister from 28 August 2002 to 21 March 2005, and Nahas Angula occupied the position from 21 March 2005 to 4 December 2012. He was succeeded by Hage Geingob, who in turn was succeeded as prime minister by Saara Kuugongelwa when he became president of Namibia on 21 March 2015. Legislative branch Parliament has two chambers: a National Assembly (lower house), elected for a five-year term, and a National Council (upper house), elected for a six-year term. The Assembly is the primary legislative body, with the Council playing more of an advisory role.
From Namibian independence until 2014 the National Assembly consisted of 78 members, 72 elected by proportional representation and 6 appointed by the president. The National Council had 26 representatives of the Regional Councils: every Regional Council in the 13 regions of Namibia elected two representatives to serve on this body. Prior to the 2014 general elections the constitution was amended. Since then there are 104 seats in the National Assembly (96 elected, 8 appointed), and 42 seats in the National Council (3 from each region, with the number of regions increased to 14).

Judicial branch
The highest judicial body is the Supreme Court, whose judges are appointed by the president on the recommendation of the Judicial Service Commission. The judicial structure in Namibia parallels that of South Africa. In 1919, Roman-Dutch law was declared the common law of the territory and remains so to the present.

Political parties and elections
Elections were held in 1992 to elect members of 13 newly established Regional Councils, as well as new municipal officials. Two members from each Regional Council serve simultaneously as members of the National Council, the country's second house of Parliament. Nineteen of its members are from the ruling SWAPO party, and seven are from the Democratic Turnhalle Alliance (DTA). In December 1994, elections were held for the President and the National Assembly. Namibia has about 40 political groups, ranging from modern political parties to traditional groups based on tribal authority.

In addition to the governmental political structure, Namibia has a network of traditional leadership, with currently 51 recognised traditional authorities and their leaders. These authorities cover the entire Namibian territory. Traditional leaders are entrusted with the allocation of communal land and the formulation of the traditional group's customary laws.
They also take over minor judicial work.

Constitution
The Constituent Assembly of Namibia produced a constitution which established a multi-party system and a bill of rights. It also limited the executive president to two five-year terms and provided for the private ownership of property. The three branches of government are subject to checks and balances, and provision is made for judicial review. The constitution also states that Namibia should have a mixed economy, and foreign investment should be encouraged. The constitution is noted for being one of the first to incorporate protection of the environment into its text. Namibia is a democratic but one-party-dominant state, with the South-West Africa People's Organisation (SWAPO) in power. Opposition parties are allowed, but are widely considered to have no real chance of gaining power. While the ethnic-based, three-tier, South African-imposed governing authorities have been dissolved, the current government pledged, for the sake of national reconciliation, to retain civil servants employed during the colonial period.
over-exploitation of these resources. This trend appears to have been halted and reversed since independence, as the Namibian Government is now pursuing a conservative resource management policy along with an aggressive fisheries enforcement campaign. The government seeks to develop fish-farming as an alternative. On 12 November 2019, WikiLeaks published thousands of documents and email communications by Samherji's employees, called the Fishrot Files, which indicated that hundreds of millions of ISK had been paid to high-ranking politicians and officials in Namibia with the objective of acquiring the country's coveted fishing quota.

Manufacturing and infrastructure
In 2000, Namibia's manufacturing sector contributed about 20% of GDP. Namibian manufacturing is inhibited by a small domestic market, dependence on imported goods, a limited supply of local capital, a widely dispersed population, a small skilled labour force, high relative wage rates, and subsidised competition from South Africa. Walvis Bay is a well-developed, deepwater port, and Namibia's fishing infrastructure is most heavily concentrated there. The Namibian Government expects Walvis Bay to become an important commercial gateway to the Southern African region. Namibia also boasts world-class civil aviation facilities and an extensive, well-maintained land transportation network. Construction is underway on two new arteries—the Trans-Caprivi Highway and Trans-Kalahari Highway—which will open up the region's access to Walvis Bay. The Walvis Bay Export Processing Zone operates in the key port of Walvis Bay.

Tourism
Tourism is a major contributor (14.5%) to Namibia's GDP, creating tens of thousands of jobs (18.2% of all employment) directly or indirectly and servicing over a million tourists per annum. The country is among the prime destinations in Africa and is known for ecotourism featuring Namibia's extensive wildlife. There are many lodges and reserves to accommodate eco-tourists.
Sport hunting is also a large and growing component of the Namibian economy, accounting for 14% of total tourism in the year 2000, or US$19.6 million, with Namibia boasting numerous species sought after by international sport hunters. In addition, extreme sports such as sandboarding, skydiving and 4x4ing have become popular, and many cities have companies that provide tours. The most visited places include the Caprivi Strip, Fish River Canyon, Sossusvlei, the Skeleton Coast Park, Sesriem, Etosha Pan and the coastal towns of Swakopmund, Walvis Bay and Lüderitz. In 2020, tourism was estimated to bring in N$26 million; however, due to the COVID-19 pandemic, Namibia saw a reduction of almost 90% in tourism. In the third quarter of 2021 there was an increase in tourism, but tourism is not expected to return to some kind of normality until 2023.

Labour
While many Namibians are economically active in one form or another, the bulk of this activity is in the informal sector, primarily subsistence agriculture. Many Namibians seeking jobs in the formal sector are held back by a lack of necessary skills or training. The government is aggressively pursuing education reform to overcome this problem. Namibia has a high unemployment rate. "Strict unemployment" (people actively seeking a full-time job) stood at 20.2% in 1999, 21.9% in 2002, and spiraled to 29.4% in 2008. A 2012 study by the Namibia Statistics Agency (NSA) determined the rate of unemployment to be 27.4%. This study counted subsistence farming, unpaid work, and any non-zero amount of weekly working hours as employment, and did not count as unemployed people who were not actively seeking a job. Under a much broader definition (including people who have given up searching for employment), two different studies determined the unemployment rate to be 36.7% (2004) and 51.2% (2008), respectively.
This estimate considers people in the informal economy as employed. 72% of jobless people have been unemployed for two years or more. Labour and Social Welfare Minister Immanuel Ngatjizeko praised the 2008 study as "by far superior in scope and quality to any that has been available previously", but its methodology has also received criticism. The total number of formally employed people stood at 400,000 in 1997 and fell to 330,000 in 2008, according to a government survey. The NSA 2012 study counted 396,000 formal employees. Of the roughly 25,000 school leavers each year, only 8,000 gain formal employment—largely a result of a failed education system. Namibians in the informal sector, as well as those in low-paid jobs such as homemakers, gardeners or factory workers, are unlikely to be covered by medical aid or a pension fund. All in all, only a quarter of the working population have medical aid, and about half have a pension fund. Namibia's largest trade union federation, the National Union of Namibian Workers (NUNW), represents workers organised into seven affiliated trade unions.

The government has actively courted donor assistance and foreign investment. The liberal Foreign Investment Act of 1990 provides guarantees against nationalisation, freedom to remit capital and profits, currency convertibility, and a process for settling disputes equitably. Namibia is also addressing the sensitive issue of agrarian land reform in a pragmatic manner. However, the government runs and owns a number of companies, such as TransNamib and NamPost, most of which need frequent financial assistance to stay afloat. The country's sophisticated formal economy is based on capital-intensive industry and farming. However, Namibia's economy is heavily dependent on the earnings generated from primary commodity exports in a few vital sectors, including minerals (especially diamonds), livestock, and fish.
Furthermore, the Namibian economy remains integrated with the economy of South Africa, as 47% of Namibia's imports originate from there. In 1993, Namibia became a signatory of the General Agreement on Tariffs and Trade (GATT), and the Minister of Trade and Industry represented Namibia at the Marrakech signing of the Uruguay Round Agreement in April 1994. Namibia is also a member of the International Monetary Fund and the World Bank.

Regional integration
Given its small domestic market but favourable location and a superb transport and communications base, Namibia is a leading advocate of regional economic integration. In addition to its membership in the Southern African Development Community (SADC), Namibia presently belongs to the Southern African Customs Union (SACU) with South Africa, Botswana, Lesotho, and Swaziland. Within SACU, no customs duties are levied on goods produced in, and transported between, its member states. Namibia is a net receiver of SACU revenues, estimated at N$13.9 billion in 2012. The Namibian economy is closely linked to South Africa's, with the Namibian dollar pegged to the South African rand. Privatisation of several enterprises in coming years may stimulate long-run foreign investment, although, with the trade union movement opposed, most politicians have so far been reluctant to advance the issue. In September 1993, Namibia introduced its own currency, the Namibia Dollar (N$), which is linked to the South African Rand at a fixed exchange rate of 1:1. There has been widespread acceptance of the Namibia Dollar throughout the country and, while Namibia remains a part of the Common Monetary Area, it now enjoys slightly more flexibility in monetary policy, although interest rates have so far always moved closely in line with South African rates. Namibia imports almost all of its goods from South Africa. Many exports likewise go to the South African market, or transit that country.
Namibia's exports consist mainly of diamonds and other minerals, fish products, beef and meat products, karakul sheep pelts, and light manufactures. In recent years, Namibia has accounted for about 5% of total SACU exports, and a slightly higher percentage of imports. Namibia is seeking to diversify its trading relationships away from its heavy dependence on South African goods and services. Europe has become a leading market for Namibian fish and meat, while mining concerns in Namibia have purchased heavy equipment and machinery from Germany, the United Kingdom, the United States, and Canada. The Government of Namibia is making efforts to take advantage of the American-led African Growth and Opportunity Act (AGOA), which provides preferential access to American markets for a long list of products. In the short term, Namibia is likely to see growth in the apparel manufacturing industry as a result of AGOA.

Data
The following table shows the main economic indicators in 1990–2017.

Sectors
Namibia is heavily dependent on the extraction and processing of minerals for export. Taxes and royalties from mining account for 25% of its revenue. The bulk of this revenue is created by diamond mining, which made up 7.2 percentage points of the 9.5% that mining contributed to Namibia's GDP in 2011. Rich alluvial diamond deposits make Namibia a primary source of gem-quality diamonds. Namibia is a large exporter of uranium, but a decline in international commodity prices, including for uranium, has led to several uranium projects being abandoned. Experts expect prices to rise over the next three years because of increased nuclear activity in Japan and China. The Namibian mining industry is projected to reach US$1.79 billion by 2018.

Mining and energy
Diamond production totalled 1.5 million carats (300 kg) in 2000, generating nearly $500 million in export earnings.
Other important mineral resources are uranium, copper, lead, and zinc. The country also extracts gold, silver, tin, vanadium, semiprecious gemstones, tantalite, phosphate, sulphur, and salt. Namibia is the fourth-largest exporter of nonfuel minerals in Africa, the world's fourth-largest producer of uranium, and a producer of large quantities of lead, zinc, tin, silver, and tungsten. Namibia has two uranium mines capable of providing 10% of the world's mining output. The mining sector employs only about 3% of the population, while about half of the population depends on subsistence agriculture for its livelihood. Namibia normally imports about 50% of its cereal requirements; in drought years food shortages are a major problem in rural areas. During the pre-independence period, large areas of Namibia, including off-shore, were leased for oil prospecting. Some natural gas was discovered in 1974 in the Kudu Field off the mouth of the Orange River, but the extent of this find is only now being determined.

Agriculture
About half of the population depends on agriculture (largely subsistence agriculture) for its livelihood, but Namibia must still import some of its food. Although per capita GDP is five times the per capita GDP of Africa's poorest countries, the majority of Namibia's people live in rural areas at a subsistence level. Namibia has one of the highest rates of income inequality in the world, due in part to the coexistence of an urban cash economy with a largely cash-less rural economy. The inequality figures thus take into account people who do not actually rely on the formal economy for their survival. Although arable land accounts for only 1% of Namibia, nearly half of the population is employed in agriculture. About 4,000, mostly white, commercial farmers own almost half of Namibia's arable land.
Agreement has been reached on the privatisation of several more enterprises in coming years, with hopes that this will stimulate much-needed foreign investment. However, reinvestment of environmentally derived capital has hobbled Namibian per capita income.
Satellite earth stations: 4 Intelsat (2010).

Internet
Top-level domain: .na
Internet users: 1,291,944 users, 134th in the world; 51% of the population (2019).
Fixed broadband: 61,698 subscriptions, 131st in the world (2019).
Wireless broadband: 624,257 subscriptions, 86th in the world; 28.8% of the population, 56th in the world (2012).
Internet hosts: 78,280 hosts, 84th in the world (2012).
IPv4: 199,168 addresses allocated, less than 0.05% of the world total, 92.0 addresses per 1000 people (2012).

Telecom Namibia, which has offered ADSL access since late 2006, has a de facto monopoly on ADSL access. Its monopoly was unsuccessfully challenged in the courts by MWeb Namibia in May 2007 and again in August 2011. In February 2007, the ISP MWeb Namibia began offering broadband wireless services through WiMAX, making Namibia the second African country (after Mozambique) to do so.

Internet censorship and surveillance
There are no government restrictions on access to the Internet; however, the Communications Act provides that the intelligence services can monitor e-mail and Internet usage with authorization from any magistrate. There have been some allegations and rumors that the government reviewed ways to block or curtail social media sites, but there is no concrete evidence of such action. The constitution provides for freedom of speech and of the press, and the government generally respects these rights.
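The internet statistics quoted above can be cross-checked for internal consistency: each absolute count, paired with its quoted share of the population, implies a population total. A minimal sketch (the helper function and rounding are illustrative, not part of any official methodology):

```python
# Rough consistency check on the quoted figures: a count plus its quoted
# percentage share of the population together imply a population total.
def implied_population(count, percent):
    """Population implied by an absolute count and the share it represents."""
    return count / (percent / 100)

# Figures taken from the text:
users_2019 = implied_population(1_291_944, 51)     # internet users, 51% (2019)
wireless_2012 = implied_population(624_257, 28.8)  # wireless broadband (2012)
ipv4_2012 = 199_168 / 92.0 * 1000                  # 92.0 addresses per 1000 people

print(round(users_2019))     # ≈ 2.53 million (2019)
print(round(wireless_2012))  # ≈ 2.17 million (2012)
print(round(ipv4_2012))      # ≈ 2.16 million (2012)
```

The two 2012 figures imply essentially the same population total (about 2.17 million), and the 2019 figure implies about 2.53 million, so the quoted counts and percentages are mutually consistent.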
See also
Mobile Telecommunications Limited Namibia (MTC-Namibia), a mobile phone company 66% of which is owned by Namibia Post and Telecom Holdings Limited, which is in turn wholly owned by the Namibian Government.
TN Mobile, a mobile telecommunications company 100% owned by Telecom Namibia, which is owned by Namibia Post and Telecom Holdings Limited, which is in turn wholly owned by the Namibian Government.

Commercial and community stations (UNAM Radio, Katutura Community Radio, etc.) have received licences. Most of these stations broadcast various music formats; political discussions, news and phone-in programs remain mostly the domain of the national broadcaster (NBC), which broadcasts nine radio services nationally (in various Namibian languages, including German - the only full-time German service outside of Europe), plus the new !Ha service, broadcasting to the San community in Tsumkwe.

Television stations: 1 private and 1 state-run TV station; satellite and cable TV service is available; transmissions of multiple international broadcasters are available (2007).

The television network with the widest transmission range is the Namibian Broadcasting Corporation (NBC, not to be confused with the American NBC network). The NBC is the successor to the South Africa–run South West African Broadcasting Corporation (SWABC), which was modeled on the original SABC. Like the radio services of the NBC, the television service tries to cater to all the linguistic audiences in Namibia, although the dominant language is English (Namibia's official language). The commercial "free to air" station is One Africa Television, the successor to the now defunct TV Africa. It has expanded its transmitter network and is now available in most major towns and cities in Namibia. In 2007 it commenced broadcasting a local television news bulletin each evening. The Trinity Broadcasting Network (TBN) is a religious television station, with some material originating locally, although it also carries relays from the United States.
It is based in Windhoek and holds a community television licence granted in 2001.

Telephones
Calling code: +264
International call prefix: 00
Main lines in use: 144,575 lines, 126th in the world (2019); 171,000 lines (2012); 140,000 lines (2008); 127,900 lines (2004); 110,200 lines (2000); 100,848 lines (1997).
Mobile cellular: 2.92 million lines, 142nd in the world (2019); 2.4 million lines (2012); 1.1 million lines (2008); 450,000 lines (2006); 495,000 lines (2005); 82,000 lines (2000 estimate); 20,000 lines (1998).
Telephone system: good system; core fiber-optic network links most centers and connections are now digital; multiple mobile-cellular providers with a combined subscribership of more than 100 telephones per 100 persons; fiber-optic cable to South Africa, microwave radio relay link to Botswana, direct links to other neighboring countries (2010).
Communications cables: connected to the African Coast to Europe (ACE) and the West Africa Cable System (WACS) submarine cables, as well as the South African Far East (SAFE) submarine cable through South Africa (2010).
The first permanent road, established for ox wagons, was built at the initiative of Heinrich Schmelen, Rhenish missionary in Bethanie, in the early 19th century. It led from Bethanie to Angra Pequeña, today the town of Lüderitz, and was intended to serve the natural harbour there in order to become independent of the Cape Colony.

Road
Namibia's road network is regarded as one of the best on the continent; road construction and maintenance adheres to international standards. The country's 48,875.27 km of roads (2017) are administered by the Roads Authority, a state-owned enterprise established by Act 17 of 1999. Due to low traffic volumes the majority of roads are not tarred.
The distribution of road surfaces is:
standard bitumen road
low-volume bitumen road. These roads have the same base layer as gravel roads but are covered with a thin layer of bitumen to reduce maintenance cost and dust formation.
standard gravel road, covered with imported gravel.
earth-graded road. These roads are built by clearing the vegetation and blading the surface. Compaction is achieved by the traffic using the road. Some of these roads are not graded at all but just consist of earth or sand tracks separated by vegetation. These tracks are in use where a daily usage of less than five vehicles is expected.
salt road. These roads consist of concentrated salt water and gypsum-rich material. They are only built near the Atlantic coast inside the mist belt.

Roads by region (2017)
The major highways in Namibia are as follows:
(freeway) from Windhoek to Okahandja, .
in two discontinuous sections, first running from Noordoewer (South African border) to the southern terminus of the A1 in Windhoek, then resuming at the northern terminus of the A1 in Okahandja and running to Oshikango (Angolan border). Combined, the two sections have a length of .
from Walvis Bay to Okahandja, .
from Ariamsvlei (South African border) to Grünau, .
from Lüderitz
Army
The landward arm of service of the Defence Force is the Namibian Army; it is also the largest of the NDF's service branches.

Air Force
The aerial warfare branch is small, but was bolstered with deliveries of some fighter jets in 2006 and 2008.

Navy
Development of the maritime warfare branch has been slow, and the force was only formally established in 2004, 14 years after independence. Today it numbers over 1,100 personnel and deploys a small number of lightly armed patrol vessels. Extensive Brazilian aid assisted in its development.

Joint Headquarters
The Joint Headquarters is an Arm of Service level institution in the Defence Force, created by the Minister of Defence in terms of section 13 of the Defence Act.

Training Institutions
Army Battle School
Situated at the Oshivelo Army base, the school offers units and battle groups the opportunity to test their combat skills in conventional and non-conventional warfare.
The school also offers courses such as:
Company Group Commander Course (CGC)
Platoon Commander Course (PCC)
Platoon Sergeant Commanders Course
Section Commanders Course

Army Technical Training Centre
Established in 2011, the technical centre imparts to students the knowledge to repair and maintain army systems and installations. The centre was commissioned on 27 February 2015.

Military School Okahandja
The Namibian Military School is the main training and academic unit of the Namibian Defence Force. It offers officer cadets and NDF officers an opportunity to obtain a military-oriented academic qualification. Training and teaching in the institution ranges from basic military training to technical mechanical training.

Namibia Command and Staff College
The Namibia Command and Staff College offers the Junior Staff Course (JSC) and the Senior Command and Staff Course (SCSC). It provides staff training to prepare students for staff appointments.

Parachute Training School
The force's parachute airborne school is based at the Grootfontein Air Force Base. Here students from all service branches train to qualify as parachute specialists. The school was set up with help from the South African private military parachute training company Chute Systems, which trains Namibia's airborne forces and associated staff, e.g. parachute riggers.

Naval Training School
Established on 22 November 2009, the Naval Training School was commissioned by President Hage Geingob on 22 July 2016. It is administratively divided into two sections, the Sailors Training Wing and the Marine Training Wing. Its commanding officer is Captain (N) Lazarus Erastus. It offers the following courses:
Basic Seamanship Course
Specialization Course
Section Commander Course
Marine Petty Officers Course
Sailors Petty Officers Course

School of Air Power Studies
The School of Air Power Studies, run in conjunction with the Namibia Aviation Training Academy, trains pilots and technicians.
School of Military Science
The School of Military Science, run in conjunction with the University of Namibia, offers officers in the Defence Force qualifications ranging from Bachelor of Science honours degrees in the nautical, army and aeronautical fields, to a postgraduate diploma in Security and Strategic Studies, and a Master of Arts in Security and Strategic Studies (MA-SSS).

School of Signals
The School of Signals provides training to all personnel wishing to specialize in the communications field. The signals school was formed in

The exception is the Joint Operations Directorate, whose head is a major general. The Joint Operations Directorate is responsible for force deployment in the military.
Chief of Defence Force: Air Marshal Martin Pinehas
Army Commander: Major-General Matheus Alueendo
Air Force Commander: Air Vice Marshal Teofilus Shaende
Navy Commander: Rear Admiral Alweendo Amungulu
Chief of Staff, Joint Operations: Maj Gen Joshua Ndandalwakwasha Namhindo
Chief of Staff, Human Resources: Brig Gen John I. Robinson
Chief of Staff, Defence Intelligence: Rear Admiral (JG) S.S. Hangula
Chief of Staff, Defence Health Services: Brig Gen Dr. S.S. Ndeitunga
Chief of Staff, Information & Communication Technology: Brig Gen Abisai Heita
Chief of Staff, Logistics:
Chief of Staff, Defence Inspector General:

Chief of Defence Force
The Chief of the Defence Force (Namibia) is always a commissioned three-star General/Air/Flag Officer from the officer corps. The first chief of the NDF was Lieutenant-General Dimo Hamaambo. He was previously the leader of PLAN, and a survivor of the Battle of Cassinga. Lieutenant-General Hamaambo was the first to be laid to rest at the Heroes' Acre memorial outside Windhoek, a few days after its official opening in 2002. Lieutenant-General Solomon Huwala replaced Hamaambo as Chief of the NDF on Hamaambo's retirement.
After Lieutenant-General Huwala retired in October 2006, Lieutenant-General Martin Shalli headed the NDF. President Hifikepunye Pohamba suspended Lieutenant-General Shalli from his post as Chief of Defence Force in 2009 over corruption allegations dating back to the time when Shalli served as Namibia's High Commissioner to Zambia. During the suspension, Army Commander Major General Peter Nambundunga acted as Chief. Shalli was eventually forced to retire in January 2011; the post of Chief of the NDF was given to Epaphras Denga Ndaitwah. Ndaitwah served until 31 December 2013, when the NDF Chief's position was given to Maj Gen John Mutwa. In February 2012, it was reported that a Chinese company had paid US$499,950 into Shalli's account in Zambia while he was the NDF chief. Poly Technologies was supplying equipment to the NDF at the time. 1990–2000 Lieutenant-General Dimo Hamaambo, Namibian Army. 2000–2006 Lieutenant-General Solomon Huwala, Namibian Army. 2006–2011 Lieutenant-General Martin Shalli, Namibian Army. 2011–2013 Lieutenant-General Epaphras Denga Ndaitwah, Namibian Army. 2013–2020 Lieutenant-General John Mutwa, Namibian Army. 2020 – Incumbent Air Marshal Martin Pinehas, Namibian Air Force. NDF Sergeant Major NDF Sergeant Major is the highest appointment a non-commissioned officer may receive. Duties of the NDF Sergeant Major include ensuring that the discipline, drill, dress code, performance standards and morale of the non-commissioned officers are maintained. The current NDF Sergeant Major is Warrant Officer Class 1 (WO1) Leonard Iiyambo. He succeeded WO1 Albert Siyaya, who in turn took over from retired Namibian Navy WO1 Isak Nankela. Previous Sergeant Majors are: 1990–1997 WO1 retired K. Lossen, Namibian Army 1997–2000 Late WO1 retired A.H. Vatileni, Namibian Army 2000–2007 WO1 retired E.K. Mutota, Namibian Army 2007–2011 WO1 retired D.J. 
Angolo, Namibian Navy 2011–2017 WO1 retired Isak Nankela, Namibian Air Force 2017–2018 WO1 Albert Siyaya, Namibian Air Force 2018–2019 WO1 Leonard Iiyambo, Namibian Army 2019– Incumbent WO1 Joseph Nembungu, Namibian Air Force Joint Operations Directorate The Joint Operations Directorate is the only directorate headed by a two-star flag/air/general officer. Its role is to coordinate and conduct combined operations and to implement plans and doctrines in the force. The first Director of Operations in 1990 was Brigadier General Martin Shalli. Defence Health Services The force's Defence Health Services provides medical services to service personnel; it operates sick bays at all bases and units as well as the military hospitals. Logistics Directorate The Logistics Directorate is responsible for supplying materiel to the force. The first Director for Logistics was Colonel Peter Nambundunga. Defence Inspectorate The Defence Inspector General's Directorate is responsible for maintaining the efficiency and effectiveness of the force. It also investigates both internal and external complaints. The current Defence Inspector General is Brigadier General Fiina Amupolo. Namibian Defence Force ranks NDF ranks are based on the Commonwealth rank structure. There is no approved four-star general rank in the NDF. The Chief of Defence Force is a singular appointment that comes with an elevation to the rank of lieutenant general for an Army officer, air marshal for an Air Force officer and vice admiral for a Navy officer. Arms of service commanders, i.e. the Army, Air Force and Navy commanders, hold the ranks of major general, air vice marshal and rear admiral respectively. The rank of brigadier has also been transformed into brigadier general. Directorate heads are always brigadier generals, e.g. the Chief of Staff for Defence Intelligence. Warrant Officer Class 1 Appointments Any warrant officer class 1 could be posted to substantive posts, including Army The landward |
army and a fragile economy, the Namibian Government's principal foreign policy concern is developing strengthened ties within the Southern African region. A dynamic member of the Southern African Development Community, Namibia is a vocal advocate for greater regional integration. International disputes Namibia is involved in several minor international disputes. A commission has been established with Botswana to resolve small residual disputes along the Caprivi Strip, including the Situngu marshlands along the Linyanti River. Botswana residents protest Namibia's planned construction of the Okavango hydroelectric dam at Popa Falls. There is a managed dispute with South Africa over the location of the boundary in the Orange River. A dormant dispute remains where the Botswana, Namibia, Zambia, and Zimbabwe boundaries converge. Angolan rebels and refugees still reside in Namibia. Bilateral relations Africa Americas Asia Europe Oceania Namibia and the Commonwealth of Nations Namibia has been a Commonwealth republic since 1990, when South West Africa became independent of South Africa. See also List of diplomatic missions in Namibia List of diplomatic |
any land, and they must enter into a lease arrangement with landowners to use land. Non-Nauruans cannot own land on the island. Nauru's Supreme Court, headed by the Chief Justice, is paramount on constitutional issues. Other cases can be appealed to the two-judge Appellate Court. Parliament cannot overturn court decisions. Historically, Appellate Court rulings could be appealed to the High Court of Australia, though this happened only rarely; the Australian court's appellate jurisdiction ended entirely on 12 March 2018, after the Government of Nauru unilaterally terminated the arrangement. Lower courts consist of the District Court and the Family Court, both of which are headed by a Resident Magistrate, who is also the Registrar of the Supreme Court. There are two other quasi-courts: the Public Service Appeal Board and the Police Appeal Board, both of which are presided over by the Chief Justice. Foreign relations Following independence in 1968, Nauru joined the Commonwealth of Nations as a Special Member; it became a full member in 1999. The country was admitted to the Asian Development Bank in 1991 and the United Nations in 1999. Nauru is a member of the South Pacific Regional Environment Programme, the South Pacific Commission, and the South Pacific Applied Geoscience Commission. In February 2021, Nauru announced it would formally withdraw from the Pacific Islands Forum in a joint statement with the Marshall Islands, Kiribati, and the Federated States of Micronesia after a dispute over Henry Puna's election as the Forum's secretary-general. Nauru has no armed forces, though there is a small police force under civilian control. Australia is responsible for Nauru's defence under an informal agreement between the two countries. The September 2005 memorandum of understanding between Australia and Nauru provides the latter with financial aid and technical assistance, including a Secretary of Finance to prepare the budget, and advisers on health and education. 
This aid is in return for Nauru's housing of asylum seekers while their applications for entry into Australia are processed. Nauru uses the Australian dollar as its official currency. Nauru has used its position as a member of the United Nations to gain financial support from both Taiwan (officially the Republic of China or ROC) and mainland China (officially the People's Republic of China or PRC) by changing its recognition from one to the other under the One-China policy. On 21 July 2002, Nauru signed an agreement to establish diplomatic relations with the PRC, accepting US$130 million from the PRC for this action. In response, the ROC severed diplomatic relations with Nauru two days later. Nauru later re-established links with the ROC on 14 May 2005, and diplomatic ties with the PRC were officially severed on 31 May 2005. However, the PRC continues to maintain a representative office on Nauru. In 2008, Nauru recognised Kosovo as an independent country, and in 2009 Nauru became the fourth country, after Russia, Nicaragua, and Venezuela, to recognise Abkhazia, a breakaway region of Georgia. Russia was reported to be giving Nauru US$50 million in humanitarian aid as a result of this recognition. On 15 July 2008, the Nauruan government announced a port refurbishment programme, financed with US$9 million of development aid received from Russia. The Nauru government claims this aid is not related to its recognising Abkhazia and South Ossetia. A significant portion of Nauru's income has been in the form of aid from Australia. In 2001, the MV Tampa, a Norwegian ship that had rescued 438 refugees from a stranded 20-metre-long boat, was seeking to dock in Australia. In what became known as the Tampa affair, the ship was refused entry and boarded by Australian troops. The refugees were eventually loaded onto Royal Australian Navy vessel HMAS Manoora and taken to Nauru to be held in detention facilities which later became part of the Howard government's Pacific Solution. 
Nauru operated two detention centres, known as State House and Topside, for these refugees in exchange for Australian aid. By November 2005, only two refugees, Mohammed Sagar and Muhammad Faisal, remained on Nauru from those first sent there in 2001, with Sagar finally resettling in early 2007. The Australian government sent further groups of asylum-seekers to Nauru in late 2006 and early 2007. The refugee centre was closed in 2008, but, following the Australian government's re-adoption of the Pacific Solution in August 2012, it was re-opened. The US Atmospheric Radiation Measurement program operates a climate-monitoring facility on the island. In March 2017, at the 34th regular session of the UN Human Rights Council, Vanuatu made a joint statement on behalf of Nauru and some other Pacific nations raising human rights violations in Western New Guinea, which has been occupied by Indonesia since 1963, and requested that the UN High Commissioner for Human Rights produce a report. Indonesia rejected the allegations. More than 100,000 Papuans have died during the 50-year Papua conflict. Amnesty International has since described the conditions of the refugees living on Nauru as a "horror", with reports of children as young as eight attempting suicide and engaging in acts of self-harm. In 2018, the situation gained attention as a "mental health crisis", with an estimated thirty children suffering from traumatic withdrawal syndrome, also known as resignation syndrome. Administrative divisions Nauru is divided into fourteen administrative districts, which are grouped into eight electoral constituencies and are further divided into villages. The most populous district is Denigomodu, with 1,804 residents, of whom 1,497 reside in an NPC settlement called "Location". The following table shows population by district according to the 2011 census. 
Economy The Nauruan economy peaked in the mid-1970s, when its GDP per capita was estimated to be US$50,000, second only to Saudi Arabia. Most of this came from phosphate mining, which went into decline in the early 1980s. There are few other resources, and most necessities are imported. Small-scale mining is still conducted by RONPhos, formerly known as the Nauru Phosphate Corporation. The government places a percentage of RONPhos's earnings into the Nauru Phosphate Royalties Trust. The trust manages long-term investments, which were intended to support the citizens after the phosphate reserves were exhausted. Because of mismanagement, the trust's fixed and current assets were reduced considerably and may never fully recover. The failed investments included financing Leonardo the Musical in 1993. The Mercure Hotel in Sydney and Nauru House in Melbourne were sold in 2004 to finance debts, and Air Nauru's only Boeing 737 was repossessed in December 2005. Normal air service resumed after the aircraft was replaced with a Boeing 737-300 airliner in June 2006. In 2005, the corporation sold its remaining real estate in Melbourne, the vacant Savoy Tavern site, for $7.5 million. The value of the trust is estimated to have shrunk from A$1.3 billion in 1991 to A$138 million in 2002. Nauru currently lacks money to perform many of the basic functions of government; for example, the National Bank of Nauru is insolvent. The CIA World Factbook estimated a GDP per capita of US$5,000 in 2005. The Asian Development Bank's 2007 economic report on Nauru estimated GDP per capita at US$2,400 to US$2,715. There are no personal taxes in Nauru. The unemployment rate is estimated to be 23 per cent, and the government employs 95 per cent of those who have jobs. The Asian | The German settlers called the island "Nawodo" or "Onawero". The Germans ruled Nauru for almost three decades. Robert Rasch, a German trader who married a Nauruan woman, was the first administrator, appointed in 1890. 
Phosphate was discovered on Nauru in 1900 by the prospector Albert Fuller Ellis. The Pacific Phosphate Company began to exploit the reserves in 1906 by agreement with Germany, exporting its first shipment in 1907. In 1914, following the outbreak of World War I, Nauru was captured by Australian troops. In 1919 it was agreed by the Allied and Associated Powers that His Britannic Majesty should be the administering authority under a League of Nations mandate. The Nauru Island Agreement, forged in 1919 between the governments of the United Kingdom, Australia, and New Zealand, provided for the administration of the island and the extraction of the phosphate deposits by an intergovernmental British Phosphate Commission (BPC). The terms of the League of Nations mandate were drawn up in 1920. The island experienced an influenza epidemic in 1920, with a mortality rate of 18 per cent among native Nauruans. In 1923, the League of Nations gave Australia a trustee mandate over Nauru, with the United Kingdom and New Zealand as co-trustees. On 6 and 7 December 1940, the German auxiliary cruisers Komet and Orion sank five supply ships in the vicinity of Nauru. Komet then shelled Nauru's phosphate mining areas, oil storage depots, and the shiploading cantilever. Japanese troops occupied Nauru on 25 August 1942. The Japanese built two airfields, which were bombed for the first time on 25 March 1943, preventing food supplies from being flown to Nauru. The Japanese deported 1,200 Nauruans to work as labourers in the Chuuk Islands, which were also occupied by Japan. As part of the Allied strategy of island hopping from the Pacific islands towards the main islands of Japan, Nauru was bypassed and left to "wither on the vine". Nauru was finally liberated on 13 September 1945, when commander Hisayaki Soeda surrendered the island to the Australian Army and the Royal Australian Navy. The surrender was accepted by Brigadier J. R. 
Stevenson, who represented Lieutenant General Vernon Sturdee, the commander of the First Australian Army, aboard the warship HMAS Diamantina. Arrangements were made to repatriate from Chuuk the 745 Nauruans who survived Japanese captivity there. They were returned to Nauru by the BPC ship Trienza in January 1946. In 1947, a trusteeship was established by the United Nations, with Australia, New Zealand, and the United Kingdom as trustees. Under those arrangements, the UK, Australia, and New Zealand were a joint administering authority. The Nauru Island Agreement provided for the first administrator to be appointed by Australia for five years, leaving subsequent appointments to be decided by the three governments. However, in practice, administrative power was exercised by Australia alone. In 1948, Chinese guano mining workers went on strike over pay and conditions. The Australian administration imposed a state of emergency, mobilising Native Police and armed volunteers drawn from locals and Australian officials. This force, using sub-machine guns and other firearms, opened fire on the Chinese workers, killing two and wounding sixteen. Around 50 of the workers were arrested, and two of these were bayoneted to death while in custody. The trooper who bayoneted the prisoners was charged but later acquitted on the grounds that the wounds were "accidentally received." The governments of the Soviet Union and China made official complaints against Australia at the United Nations over this incident. In 1964, it was proposed to relocate the population of Nauru to Curtis Island off the coast of Queensland, Australia. By that time, Nauru had been extensively mined for phosphate by companies from Australia, Britain, and New Zealand, damaging the landscape so much that it was thought the island would be uninhabitable by the 1990s. Rehabilitating the island was seen as financially impossible. 
In 1962, Australian Prime Minister Robert Menzies said that the three countries involved in the mining had an obligation to provide a solution for the Nauruan people, and proposed finding a new island for them. In 1963, the Australian Government proposed to acquire all the land on Curtis Island (which was considerably larger than Nauru), offer the Nauruans freehold title over the island, and make the Nauruans Australian citizens. The cost of resettling the Nauruans on Curtis Island was estimated to be , which included housing and infrastructure and the establishment of pastoral, agricultural, and fishing industries. However, the Nauruan people did not wish to become Australian citizens and wanted to be given sovereignty over Curtis Island to establish themselves as an independent nation, which Australia would not agree to. Nauru rejected the proposal to move to Curtis Island, instead choosing to become an independent nation operating its own mines on Nauru. Nauru became self-governing in January 1966, and following a two-year constitutional convention, it became independent on 31 January 1968 under founding president Hammer DeRoburt. In 1967, the people of Nauru purchased the assets of the British Phosphate Commissioners, and in June 1970 control passed to the locally owned Nauru Phosphate Corporation (NPC). Income from the mines made Nauruans among the richest people in the world. In 1989, Nauru took legal action against Australia in the International Court of Justice over Australia's administration of the island, in particular Australia's failure to remedy the environmental damage caused by phosphate mining. Certain Phosphate Lands: Nauru v. Australia led to an out-of-court settlement to rehabilitate the mined-out areas of Nauru. Geography Nauru is a , oval-shaped island in the southwestern Pacific Ocean, south of the equator. The island is surrounded by a coral reef, which is exposed at low tide and dotted with pinnacles. 
The presence of the reef has prevented the establishment of a seaport, although channels in the reef allow small boats access to the island. A fertile coastal strip wide lies inland from the beach. Coral cliffs surround Nauru's central plateau. The highest point of the plateau, called the Command Ridge, is above sea level. The only fertile areas on Nauru are on the narrow coastal belt, where coconut palms flourish. The land around Buada Lagoon supports bananas, pineapples, vegetables, pandanus trees, and indigenous hardwoods, such as the tamanu tree. Nauru was one of three great phosphate rock islands in the Pacific Ocean, along with Banaba (Ocean Island), in Kiribati, and Makatea, in French Polynesia. The phosphate reserves on Nauru are now almost entirely depleted. Phosphate mining in the central plateau has left a barren terrain of jagged limestone pinnacles up to high. Mining has stripped and devastated about 80 per cent of Nauru's land area, leaving it uninhabitable, and has also affected the surrounding exclusive economic zone; 40 per cent of marine life is estimated to have been killed by silt and phosphate runoff. There are limited natural sources of freshwater on Nauru. Rooftop storage tanks collect rainwater. The islanders are mostly dependent on three desalination plants housed at Nauru's Utilities Agency. Climate Nauru's climate is hot and very humid year-round because of its proximity to the equator and the ocean. Nauru is hit by monsoon rains between November and February, but rarely has cyclones. Annual rainfall is highly variable and is influenced by the El Niño–Southern Oscillation, with several significant recorded droughts. The temperature on Nauru ranges between during the day and is quite stable at around at night. Streams and rivers do not exist in Nauru. Water is gathered from roof catchment systems. Water is brought to Nauru as ballast on ships returning for loads of phosphate. 
Ecology Fauna is sparse on the island because of a lack of vegetation and the consequences of phosphate mining. Many indigenous birds have disappeared or become rare owing to the destruction of their habitat. There are about 60 recorded vascular plant species native to the island, none of which are endemic. Coconut farming, mining, and introduced species have seriously disturbed the native vegetation. There are no native land mammals, but there are native insects, land crabs, and birds, including the endemic Nauru reed warbler. The Polynesian rat, cats, dogs, pigs, and chickens have been introduced to Nauru from ships. The diversity of the reef marine life makes fishing a popular activity for tourists on the island; also popular are scuba diving and snorkelling. Politics The president of Nauru is Lionel Aingimea, who heads a 19-member unicameral parliament. The country is a member of the United Nations, the Commonwealth of Nations, and the Asian Development Bank. Nauru also participates in the Commonwealth and Olympic Games. Nauru has also recently become a member country of the International Renewable Energy Agency (IRENA). The Republic of Nauru became the 189th member of the International Monetary Fund in April 2016. |
Nauru, which are represented in the 12-pointed star in the nation's flag. Nauruans traced their descent on the female side. The first Europeans to encounter the island were on the British whaling ship Hunter, in 1798. When the ship approached, "many canoes ventured out to meet the ship. The Hunter's crew did not leave the ship nor did Nauruans board, but Captain John Fearn's positive impression of the island and its people" led to its English name, Pleasant Island. This name was used until Germany annexed the island 90 years later. From around 1830, Nauruans had contact with Europeans from whaling ships and traders who replenished their supplies (such as fresh water) at Nauru. The islanders traded food for alcoholic toddy and firearms. The first Europeans to live on the island, starting perhaps in 1830, were Patrick Burke and John Jones, Irish convicts who had escaped from Norfolk Island, according to Paradise for Sale. Jones became "Nauru's first and last dictator," killing or banishing several other beachcombers who arrived later, until the Nauruans banished Jones from the island in 1841. The introduction of firearms and alcohol destroyed the peaceful coexistence of the 12 tribes living on the island. A 10-year internal war began in 1878 and resulted in a reduction of the population from 1,400 (1843) to around 900 (1888). Ultimately, alcohol was banned and some arms were confiscated. German protectorate In 1886 Germany was granted the island under the Anglo-German Declaration. The island was annexed by Germany in 1888 and incorporated into Germany's New Guinea Protectorate. Nauru was occupied on 16 April 1888 by German troops, which ended the civil war. On 1 October 1888 the German gunboat SMS Eber landed 36 men on Nauru. Accompanied by William Harris, the German marines marched around the island and returned with the twelve chiefs, the white settlers and a Gilbertese missionary. 
The chiefs were kept under house arrest until the morning of 2 October, when the annexation ceremony began with the raising of the German flag. The Germans told the chiefs that they had to surrender all weapons and ammunition within 24 hours or the chiefs would be taken prisoner. By the morning of 3 October, 765 guns and 1,000 rounds of ammunition had been turned over. The Germans called the island Nawodo or Onawero. The arrival of the Germans ended the war, and social changes brought about by the war established kings as rulers of the island, the most widely known being King Auweyida. Christian missionaries from the Gilbert Islands also arrived in 1888. The Germans ruled Nauru for almost three decades. Robert Rasch, a German trader who married a native woman, was the first administrator, appointed in 1888. At the time there were twelve tribes on Nauru: Deiboe, Eamwidamit, Eamwidara, Eamwit, Eamgum, Eano, Emeo, Eoraru, Irutsi, Iruwa, Iwi and Ranibok. Today the twelve tribes are represented by the twelve-pointed star in the flag of Nauru. Phosphate was discovered on Nauru in 1900 by the prospector Albert Ellis. The Pacific Phosphate Company started to exploit the reserves in 1906 by agreement with Germany. The company exported its first shipment in 1907. World War I to World War II In 1914, following the outbreak of World War I, Nauru was captured by Australian troops, after which Britain held control until 1920. Australia, New Zealand, and the United Kingdom signed the Nauru Island Agreement in 1919, creating a board known as the British Phosphate Commission (BPC), which took over the rights to phosphate mining. According to the Commonwealth Bureau of Census and Statistics (now the Australian Bureau of Statistics), "In common with other natives, the islanders are very susceptible to tuberculosis and influenza, and in 1921 an influenza epidemic caused the deaths of 230 islanders." 
In 1923, the League of Nations gave Australia a trustee mandate over Nauru, with the United Kingdom and New Zealand as co-trustees. In 1932, the first Angam Baby was born. World War II During World War II, Nauru suffered significant damage from both Axis (German and Japanese) and Allied forces. On 6 and 7 December 1940 the Nazi German auxiliary cruisers Orion and Komet sank four merchant ships. On the next day, Komet shelled Nauru's phosphate mining areas, oil storage depots, and the shiploading cantilever. The attacks seriously disrupted phosphate supplies to Australia and New Zealand (mostly used for munitions and fertiliser). Japanese troops occupied Nauru on 26 August 1942, and executed seven Europeans. The native Nauruans were badly treated by the occupying forces. On one occasion, thirty-nine leprosy sufferers were reputedly loaded onto boats which were towed out to sea and sunk. The Japanese troops built two airfields on Nauru, which were bombed for the first time on 25 March 1943, preventing food supplies from being flown to Nauru. In 1943 the Japanese deported 1,200 Nauruans to | islands. Nauru was finally set free from the Japanese on 13 September 1945, when Captain Solda, the commander of all the Japanese troops on Nauru, surrendered the island to the Royal Australian Navy and Army. This surrender was accepted by Brigadier J. R. Stevenson, who represented Lieutenant General Vernon Sturdee, the commander of the First Australian Army, on board the warship HMAS Diamantina. Arrangements were made to repatriate from Chuuk the 745 Nauruans who survived Japanese captivity there. They were returned to Nauru by the BPC ship Trienza in January 1946. Trust Territory In 1947, a trusteeship was established by the United Nations, and Australia, New Zealand, and the United Kingdom became the U.N. trustees of the island, although practical administration was mostly handled by Australia. 
By 1965 the population had reached 5,561, of which just under half were considered Nauruan. In July 1966 the Nauruan Head Chief spoke at the United Nations Trusteeship Council, calling for independence by 31 January 1968. This was supported by the General Assembly in December of that year. Australia and the other administering powers sought to arrange an alternative to independence, such as internal self-government similar to that of the West Indies Associated States, or with Australia retaining a role in foreign affairs. Under these envisioned solutions, the resulting political settlement would be permanent, with no route to independence. This was due to concern over the implications for other Pacific territories, and over such a small community (the size of an "English village") gaining the full trappings of statehood. These suggestions were, however, rejected by Nauru, and Australia was concerned that even if they were accepted by Nauru, they might not be accepted by the UN. In June 1967 it was agreed that assets belonging to the British Phosphate Commission on the island would be sold to Nauru for 21 million Australian dollars. Nauru was granted unconditional independence on 31 January 1968. Independence Nauru became self-governing in January 1966. On 31 January 1968, following a two-year constitutional convention, Nauru became the world's smallest independent republic. It was led by founding president Hammer DeRoburt. In 1967, the people of Nauru purchased the assets of the British Phosphate Commissioners, and in June 1970, control passed to the locally owned Nauru Phosphate Corporation. Money gained from the exploitation of phosphate was put into the Nauru Phosphate Royalties Trust and gave Nauruans the second-highest GDP per capita (second only to the United Arab Emirates) and one of the highest standards of living in the Third World. 
In 1989, Nauru took legal action against Australia in the International Court of Justice over Australia's actions during its administration of Nauru. In particular, Nauru made a legal complaint against Australia's failure to remedy the environmental damage caused by phosphate mining. Certain Phosphate Lands: Nauru v. Australia led to an out-of-court settlement to rehabilitate the mined-out areas of Nauru. By the close of the twentieth century, the finite phosphate supplies were fast running out. Nauru finally joined the UN in 1999. Modern-day Nauru As its phosphate stores began to run out (by 2006, its reserves were exhausted), the island was reduced to an environmental wasteland. Nauru appealed to the International Court of Justice for compensation for the damage from almost a century of phosphate strip-mining by foreign companies. In 1993, Australia offered Nauru an out-of-court settlement of A$2.5 million annually for 20 years. New Zealand and the UK additionally agreed to pay a one-time settlement of $12 million each. Declining phosphate prices, the high cost of maintaining an international airline, and the government's financial mismanagement combined to make the economy collapse in the late 1990s. By the new millennium, Nauru was virtually bankrupt. In December 1999, four major United States banks banned dollar transactions with four Pacific island states, including Nauru. The United States Department of State issued a report identifying Nauru as a major money laundering centre, used by narcotics traffickers and Russian organized crime figures. President Bernard Dowiyogo took office in April 2000 for his fourth and, after a brief hiatus, fifth stints as Nauru's top executive. Dowiyogo first served as president from 1976 to 1978. He returned to that office in 1989, and was re-elected in 1992. A vote in parliament, however, forced him to yield power to Kinza Clodumar in 1995. Dowiyogo regained the presidency when the Clodumar government fell in mid-1998. 
In 2001, Nauru was brought to world attention by the Tampa affair, a diplomatic dispute between Australia, Norway and Indonesia centred on the Norwegian cargo ship MV Tampa. The ship carried asylum seekers, hailing primarily from Afghanistan, who had been rescued while attempting to reach Australia. After much debate, many of the asylum seekers were transported to Nauru, an arrangement known in Australia as the "Pacific Solution". Shortly thereafter, the Nauruan government closed its borders to most international visitors, preventing outside observers from monitoring the refugees' condition. In December 2003, several dozen of these refugees, in protest of the conditions of their detention on Nauru, began a hunger strike. The hunger strike was concluded in early January 2004, when an Australian medical team agreed to visit the island. Since then, according to recent reports, all but two of the refugees have been allowed into Australia. During 2002, Nauru severed diplomatic recognition of Taiwan (Republic of China) and signed an agreement to establish diplomatic relations with the People's Republic of China. This move followed China's promise to provide more than US$130 million in aid. In 2004, Nauru broke off relations with the PRC and re-established them with the ROC. Nauru was also approached by the U.S. with a deal to modernize Nauru's infrastructure in exchange for reform of the island's lax banking laws, which allowed activities that are illegal in other countries to flourish. Under this deal, allegedly, Nauru would also establish an embassy in China and perform certain "safehouse" and courier services for the U.S. government, in a scheme codenamed "Operation Weasel". Nauru agreed to the deal and instituted banking reform, but the U.S. later denied knowledge of the deal. The matter is being pursued in an Australian court, and initial judgments have been in favor of Nauru. |
to sea level and a coral atoll grew on top to a thickness of about 500 metres. Coral near the surface has been dated from 5 Mya to 0.3 Mya. The original limestone has been dolomitised by magnesium from seawater. The coral was raised about 30 metres above sea level and is now a dolomite limestone outcrop, eroded in classic karst style into pinnacles up to 20 metres high. To a depth of at least 55 metres below sea level, the limestone has been dissolved, forming cavities, sinkholes, and caves. Holes on the topside of the island were filled by a phosphate layer up to several metres thick. Anibare Bay was formed by the underwater collapse of the east side of the volcano. Buada Lagoon was formed by solution of the limestone when the sea level was lower, followed by collapse. Nauru is moving to the northwest along with the Pacific Plate. Freshwater can be found in Buada Lagoon, and also in some brackish ponds at the escarpment base in Ijuw and Anabar in the northeast. There is an underground lake in Moqua Cave in the southeast of the island. Since there are no streams or rivers on Nauru, water must be gathered from roof catchment systems. Water is also brought to Nauru as ballast on ships returning for loads of phosphate. Environmental issues Nauru suffers periodic droughts and has limited natural freshwater resources; roof storage tanks collect rainwater, but the island depends mostly on a single, aging desalination plant. Extreme soil conditions are caused by high alkalinity, high phosphate levels, and low potassium. Iron, manganese, copper, molybdenum, and zinc are rendered unavailable to plants. Combined with thin or damaged soils, this causes low fertility. Intensive phosphate mining during the past 90 years has left the central 80% of Nauru a wasteland and threatens the limited remaining land resources. Nauru is a party to the international environmental agreements on biodiversity, climate change, desertification, the law of the sea and marine dumping. Climate Nauru's climate is hot and very humid year-round
any ethnic group to become a citizen. The most recent sizable immigration of Chinese people occurred in 1993. Languages The Nauruan language is the official language of Nauru. English is widely understood, is used for most government and commercial purposes, and is de facto official. According to the 2011 census, 95.3% of the population speaks Nauruan, 66.0% speak English, and 11.9% speak another language. Nauruan is an Austronesian language; however, no adequate written grammar of the language has been compiled, and its relationships to other Micronesian languages are not well understood. German is also common in Nauru because Nauru was previously part of German New Guinea. Religions The main religions in Nauru are Nauru Congregational (35.71%), Roman Catholic (32.96%), Assemblies of God (12.98%), and Nauru Independent (9.50%). The biggest changes from 2002 to 2011 were an increase from 0 to 1,291 adherents for the Assemblies of God and a decrease from 1,417 to 282 for "Other". Public holidays include New Year's Day (1 January), Independence Day (31 January), Good Friday, Easter Monday, Easter Tuesday, Constitution Day (17 May), National Youth Day (25 September), Christmas Day, and Boxing Day. Nauru Independent was the predominant religion in Nauru before the late nineteenth and early twentieth centuries, when foreign missionaries introduced Christianity to the island. It is still practised by 9.5% of the population, according to the 2011 census. There are a few active Christian missionary organisations, including representatives of Anglicanism, Methodism, and Catholicism. The Constitution provides for freedom of religion; however, the Government has restricted this right in some circumstances. The government has restricted the religious practices of The Church of Jesus Christ of Latter-day Saints and the Jehovah's Witnesses, most of whom are foreign workers employed by RONPhos.
Education The literacy rate in Nauru, defined as "people who are currently enrolled in school and/or have reached at least grade 5 of primary education", is 96.5%, as of 2011. There are 3,190 students and 104 teachers, as of 2013. The 2011 census stated that 4 percent of the population aged 15 years or older have a primary education, 91 percent have a secondary education, and 5 percent have a tertiary education. Education is based on the British system, which requires attendance from ages 5 to 16. Students spend three years in preschool, six years in primary school, and four years in secondary school. Tertiary education is not compulsory. An estimated 10.7 percent of GDP was spent on education in 1999. As of 2013, there are five preschools, four primary schools, three secondary schools, and one tertiary school. The lone college, the University of the South Pacific, opened in the 1970s via distance courses, and in 1987 a campus was built on the island. It offers accounting, management, primary education, and English studies as majors. The education system nearly collapsed in 2000–2005. During this time, exams were not held, teachers were not paid, and schools did not have enough funding to continue; as a result, over half of the schools closed. In 2009, the Australian Government partnered with the Nauruan Department of Education to help. The agreement resulted in a 5.7% increase in student numbers, raised the share of teachers with a degree from 30% to 93%, and provided over A$11 million to construct a new secondary school. Health Economic indicators Net monthly income in 2006 averaged A$2,597 (A$ in 2014). In the same year, gross monthly income averaged A$9,554 (A$ in 2014). These figures were calculated during the mini-census of 2006, which had a 54.4% response rate.
The income was calculated using the following factors: first-job salary, subsistence, other business income, second-job salary, services to other households, benefits, house gifts consumed and received, and other income. Compared with the other countries that use the Australian dollar (Kiribati, Australia, and Tuvalu), Nauru ranks first in terms of income. Nauru has had no minimum wage since 2013. Nauru's number of employed people has fluctuated over the years. According to the 2011 census, there were 2,883 employed persons and 908 unemployed persons, giving an unemployment rate of 23%. The Nauru Bureau of Statistics predicted the unemployment rate would decrease to 22% in FY2014/15. The gross domestic product of Nauru was A$69.55 million in 2009, an increase of 40% from 2008. Life expectancy is 59.7 years. The population rose steadily from the 1960s until 2006, when the Government of Nauru repatriated thousands of Tuvaluan and I-Kiribati workers from the country. Since 1992, Nauru's birth rate has exceeded its death rate; the natural growth rate is positive. In terms of age structure, the population is dominated by the 15–64-year-old segment (65.6%). The median age of the population is 21.5, and the estimated gender ratio is 0.91 males per female. Nauru is inhabited mostly by Nauruans (93.6%), while minorities include I-Kiribati (1.8%), Chinese (1.5%) and others (3.1%). The demographic history of Nauru is marked by several migrations: the area was first inhabited by Micronesian people about 3,000 years ago. The first European to find the island was John Fearn in 1798. The country was then annexed by Germany in 1888. The next major population change came when the Japanese occupied the island during World War II in 1942. During this time, the Japanese deported several thousand Nauruans to other islands. In the 1960s, the country gained independence, after which the percentage of Nauruans started to increase.
The most recent demographic shift came in the 2000s, when the government repatriated several groups of non-Nauruans from the country. The Nauruan language is the official language of Nauru, but English is often used in the country. Nauruan is declared as the primary language of 95.3% of the population. The 2011 census revealed that 66.0% of the population spoke English and 11.9% another language. The main religions of Nauru are the Nauru Congregational Church (35.71%) and Roman Catholic (32.96%). The literacy rate in Nauru is 96.5%. The proportion of the country's population aged 15 and over attaining academic degrees is one of the lowest in the world, reaching 7.9% in 2011. An estimated 10.7% of the gross domestic product (GDP) is spent on education. Nauru has a universal health care system, and in 2012 an estimated 7.5% of its GDP was spent on healthcare. Nauru has the highest obesity ranking in the world; 97 per cent of men and 93 per cent of women are obese. In 2006, the average net monthly income was A$2,597 (A$ in 2014). The most significant sources of employment are phosphate mining, banking, and the production of various coconut products. In 2011, the unemployment rate was 23%. The 2011 census enumerated 1,647 total households, averaging 6.0 persons per house. The urbanisation rate in Nauru is 100%. Population With a population of about ten thousand in 2011, Nauru ranks around 230th in the world by population. Its population density is 478 inhabitants per square kilometre (185 per square mile). The overall life expectancy in Nauru at birth is 59.7 years. The total fertility rate of 3.70 children per mother is one of the highest in Oceania. The United Nations projects that the population will stay around 10,000 in the 2020s, and the Nauru Bureau of Statistics estimates the population will increase to 20,000 by 2038. In Nauru's history, there have been six major demographic changes. The island was first inhabited by Micronesian people roughly 3,000 years ago.
The first European to find the island was John Fearn in 1798. In 1888, the country was annexed by Germany. The next demographic change came when the Japanese occupied the island during World War II in the 1940s. During this time, the Japanese deported several thousand Nauruans to other islands. The next major demographic change came in the 1960s, when the country gained independence and the percentage of Nauruans started to increase. The last major demographic change came in 2006, when the Government of Nauru repatriated almost all of the remaining Tuvaluan and I-Kiribati workers following large-scale workforce reductions at the Republic of Nauru Phosphate Corporation (RONPhos) and in government. The census of 2006 recorded 9,233 people in Nauru, a decline of 2.13% per year from the previous census of 2002. From 2002 to 2011, there was negative net migration, with an annual average of 109 net emigrants from 2006 to 2011. In 2009, however, there were 1,820 arrivals and 1,736 departures, for net immigration of 84 people; this was the first positive net figure since data collection began in 2002. Detailed data on arrivals and departures collected by the Nauruan Customs and Immigration Office is not available, so more specific immigration figures are unavailable. As of the 2011 census, 57% of the population over 15 years old were legally or de facto married, 35%
course of recent years, however, offshore banking institutions and instruments have come under increasing scrutiny by international bodies seeking to make international finance a more transparent system. Nauru, as a result, has been a casualty of this movement. In December 1999, four major international banks banned dollar transactions with Nauru. The United States Department of State issued a report identifying Nauru as a major money laundering centre, used by narcotics traffickers and organized crime figures. Shifting governments Nauru had 17 changes of administration between 1989 and 2003. Bernard Dowiyogo died in office in March 2003 and Ludwig Scotty was elected as president, later being re-elected to serve a full term in October 2004. Following a vote of no confidence on 19 December 2007, Scotty was replaced by Marcus Stephen. Stephen resigned in November 2011, and Freddie Pitcher became president. Sprent Dabwido then filed a motion of no confidence in Pitcher, which resulted in Dabwido becoming president. Following parliamentary elections in 2013, Baron Waqa was elected president. He held the presidency for six years, from 2013 to 2019. President Waqa was a strong supporter of Australia keeping refugees in a refugee camp on Nauruan soil. He lost his parliamentary seat in the 2019 Nauruan parliamentary election, meaning he could not be re-elected. In August 2019, the parliament elected former human rights lawyer Lionel Aingimea as the new President of Nauru. President Bernard Dowiyogo took office in April 2000 for his fourth and, after a minimal hiatus, fifth stints as Nauru's top executive. Dowiyogo first served as president from 1976 to 1978. He returned to that office in 1989, and was re-elected in 1992. A vote in parliament, however, forced him to yield power to Kinza Clodumar in 1995. Dowiyogo regained the presidency when the Clodumar government fell in mid-1998.
In April 2000, René Harris, former chairman of the Nauru Phosphate Corporation, became president after briefly assembling support in parliament. Harris's attempt to put together an administration lasted for only a few days of parliamentary maneuvering. In the end, Harris proved unable to secure parliament's confidence, and Dowiyogo returned yet again to the presidency by the end of the month. René Harris was finally able to claim power as president of Nauru in March 2001, when he was elected to the presidency by the parliament; his term was to last three years, presumably ending in 2004. Environmental concerns Phosphate depletion will likely be one of the most important considerations for the government in the next few years, as the supply is forecast to be exhausted by 2003. Since Nauru imports almost everything it consumes (including food, water and fuel), the need to diversify the economy and to generate other sources of revenue is of paramount importance. As noted above, offshore banking has been one arena into which Nauru has ventured; however, the rewards are limited by growing concern about the ethical parameters of this business. Tourism is another industry that is gradually being developed. Yet another concern is the ecological damage that resulted from a century of phosphate mining. Along with the United Kingdom, Australia and New Zealand were responsible for the large-scale and indiscriminate mining of phosphate on the tiny island for most of the 20th century. The mining left an ecological and economic disaster for Nauru to handle when the country achieved independence in 1968. Not only was the country's principal resource and employment-generating activity almost entirely depleted by the rapid mining done by the three countries, but the mining companies had also failed to follow the basic principles of restoring and regenerating the lands where mining had been completed.
Thus, Nauru was left to handle the immense and expensive task of restoring large tracts of land destroyed by the mining. Nauru demanded compensation from the three nations, but was refused. Finally, Nauru turned to the International Court of Justice at The Hague in the Netherlands, filing a claim of $73 million against the three countries. The case was settled out of court in 1993 by Australia, with Britain and New Zealand also contributing to the reparations sought by Nauru. Today, Nauru is almost totally dependent on trade with New Zealand, Australia and Fiji. Arable land is very limited, as are all other natural resources, now that its long-time economic base of phosphate mining has been almost completely depleted. Foreign policy On the international front, in late July 2002, Taiwan cut its diplomatic ties with Nauru. Taiwan and Nauru had shared diplomatic ties for 22 years; Taiwan has enjoyed diplomatic ties with several Pacific countries even in the face of Beijing's "One China" policy. Nevertheless, this particular 22-year-long relationship was broken when Nauru's president decided to change its allegiance and establish formal relations with China. The move effectively shifted diplomatic recognition from Taipei to Beijing, angering the government of Taiwan, which described the shift in policy as "reckless." Nauru's decision to recognize Beijing via the signing of diplomatic papers and a joint communique ultimately resulted in the cessation of Taiwanese aid. Nauru instead received a US$150 million aid package from Beijing. In April 2005, during a state visit to the Marshall Islands, President Chen Shui-bian of Taiwan met and spoke with the Nauruan President Ludwig Scotty. On 14 May 2005, the two countries signed the necessary documents to restore formal ties and reopen embassies. The People's Republic of China consequently severed ties two weeks later, on 31 May.
Internal disputes In early 2003, a fight for power emerged between President René Harris and former President Bernard Dowiyogo. The power struggle followed a non-confidence vote in parliament, which effectively ejected Harris from the position of president. Reports suggested that Harris was ousted because of rising anxieties over economic mismanagement. At the time, Dowiyogo referred to Nauru's political situation as "critical." It was reported that Dowiyogo became president, replacing Harris; however, information surrounding the shift in power was sparse, and there was very little international coverage of the matter. Regardless, Dowiyogo's tenure did not last long. In March 2003, Dowiyogo underwent heart surgery in the United States and died. 2003 – present In May 2003, elections were held within the parliament to select a new president. Envoys were subsequently sent to help Nauru deal with its financial crisis. By August 2004, a report by the Australian Centre for Independent Studies suggested that Nauru might consider relinquishing its independent status in favor of becoming an Australian territory. The report called for radical economic reform as well as the restructuring of both governmental instruments and the public service. The author of the report had offered Nauru economic advice in the past. Scotty was re-elected to serve a full term in October 2004. Following a vote of no confidence by Parliament against President Scotty on 19 December 2007, Marcus Stephen became President. Following Stephen's resignation in November 2011, Freddie Pitcher became President. Sprent Dabwido then moved a motion of no confidence in Pitcher, and Dabwido was duly elected President by the parliament, with nine votes supporting his nomination and eight opposing. Elections for Parliament were held in 2013, after which Baron Waqa was elected by Parliament as President. He held the presidency for six years, from 2013 to 2019.
President Waqa was a strong supporter of Australia keeping refugees in a refugee camp on Nauruan soil. The incumbent president lost his parliamentary seat in the 2019 Nauruan parliamentary election, meaning he lost his bid for re-election. In August 2019, the parliament elected former human rights lawyer Lionel Aingimea as the new President of Nauru. Crackdowns on Opposition politicians In January 2014, Nauru's President Baron Waqa fired the country's only magistrate, Peter Law, and its Chief Justice, Geoffrey Eames (both Australian-based justices). Eames was fired after issuing an injunction to temporarily halt Law's deportation. In May and June 2014, Waqa suspended five of the seven members of Nauru's Opposition from Parliament indefinitely. Three of the MPs, Mathew Batsiua, Kieren Keke and Roland Kun, were suspended in May 2014 for making comments to international media critical of the government and of the alleged breakdown of the rule of law. Another two, Sprent Dabwido (a former president) and Squire Jeremiah, were suspended a month later for behaving in an unruly manner. In June 2015, Jeremiah, Dabwido, and Batsiua were arrested, and Kun had his passport cancelled, amid claims that they had been trying to destabilize the Government by talking to foreign media. Executive branch President of Nauru: Lionel Aingimea (nonpartisan), in office since 27 August 2019. The Parliament elects a president from amongst its members, who appoints a Cabinet of 5–6 people. The President is both head of state and head of government. A series of no-confidence votes, resignations and elections between 1999 and 2003 saw René Harris and Bernard Dowiyogo alternate as President for numerous short periods during a time of political instability. Dowiyogo died in office on 10 March 2003, in Washington, D.C., after heart surgery. Ludwig Scotty was elected President on 29 May 2003, but this did not bring the years of political uncertainty to an end, as he was replaced by Harris a few months later.
Scotty regained the presidency in 2004, only to be ousted in a vote of no confidence in 2007. Legislative branch Parliament has 19 members, elected for a three-year term in multi-seat constituencies. Each constituency returns two members to the Nauruan Parliament, except for Ubenide, which returns four. Voting is compulsory for all citizens aged 20 or over. Political parties and elections Nauru does not have a formal structure for political parties; candidates typically stand as independents. Fifteen of the 18 members of the current parliament are independents, and alliances within the government are often formed on the basis of extended family ties. Four parties that have been active in Nauruan politics are the Nauru Party, the Democratic Party, Nauru First and the Centre Party. Judicial branch For its size, Nauru has a complex legal system. The Supreme Court, headed by the Chief Justice, is paramount on constitutional issues. Other cases can be appealed to the two-judge Appellate Court. Parliament cannot overturn court decisions, but Appellate Court rulings can be appealed to the High Court of Australia; in practice, this rarely happens. Lower courts consist of the District Court and the Family Court, both of which are headed by a Resident Magistrate, who is also the Registrar of the Supreme Court. Finally, there are also two quasi-courts: the Public Service Appeal Board and the Police Appeal Board, both of which are presided over by the Chief Justice. Local government Since 1992, local government has been the responsibility of the Nauru Island Council (NIC). The NIC has limited powers and functions as an advisor to the national government on local matters. The role of the NIC is to concentrate its efforts on local activities relevant
Relationship with Australia Currently, Nauru is heavily dependent on Australia as its major source of financial support. In 2001, Nauru signed an agreement with Australia to accommodate asylum seekers (mostly from Iraq and Afghanistan) on the island in return for millions of dollars in aid. Nauru's finances have improved significantly since 2012, with help from the reopening of the Nauru Regional Processing Centre, funded by Australia. The most recent 2020-21 Nauru Budget indicates a small economic contraction due to COVID-19 and a reduction in activity related to the Regional Processing Centre. For 2020-21, an estimated $210.5 million in revenues and $210.4 million in expenditures are expected. Balance of payments Phosphate is Nauru's only export product, although the government also receives relatively significant foreign exchange income from licensing its rich skipjack tuna fishing grounds to foreign fishing vessels, which land an annual average of 50,000 tonnes of Nauru-zone-caught tuna overseas. In 2004, income from phosphate exports was US$640,000, with Australia, New Zealand and Japan serving as the country's major export markets. In the same year, the Nauru government budget shows that income from licensing foreign fishing vessels was over US$3,000,000. Nauru needs to import almost all basic and capital goods, including food, water, fuel, and manufactured goods, with Australia and New Zealand as its major import sources. In 2004, Nauru's imports totaled about US$19.8 million.
Finance Nauru has been a cash economy since at least 2004, after the Bank of Nauru and the Republic of Nauru Finance Corporation went bankrupt and ceased operations in the early 2000s, and the licenses of all offshore banks were revoked by the Nauru government in 2004. Nauru uses the Australian dollar as its currency. Most government payments are executed through electronic funds transfer. Electronic funds transfer at point of sale was introduced in 2020. The government is required to periodically fly in Australian currency to maintain liquidity. On 2 June 2015, an agency of Bendigo and Adelaide Bank, Australia's fifth-largest bank, was established in Nauru by the Department of Finance. Effective from the end of April 2016, Westpac, one of Australia's largest banks, ceased having any dealings with the Nauru government. On 21 April 2016, it was announced that the Bendigo Bank was facing pressure to also close its operation in Nauru. As of September 2020, the agency of Bendigo and Adelaide Bank continued to operate in Nauru. Taxation On 1 October 2014, an income tax was imposed in Nauru for the first time, with high-income earners paying a flat rate of 10%. Government spending in 2015 was forecast to be under US$92 million. Taxes include an airport departure tax and a bed tax at the Meneñ Hotel. The 2007–08 Budget saw increases in existing excises on cigarettes and in duty on imports. A tax on sugary foods was also introduced, chiefly to help combat Nauru's diabetes epidemic. Tax haven status Historically, Nauru was regarded as a tax haven due to the operation of its international financial centre, which offered, amongst other things, offshore banking services. In 2001, Nauru was blacklisted internationally over concerns it had become a haven for money laundering.
Amendments made in 2004 abolished Nauru's offshore banking sector and, as recognised in Nauru's latest anti-money laundering and countering the financing of terrorism (AML/CFT) review, Nauru's offshore sector is now limited to a small offshore company register. In July 2017, the Organisation for Economic Co-operation and Development (OECD) upgraded its rating of Nauru's standards of tax transparency. Nauru had been listed alongside fourteen other countries that had failed to show that they could comply with international tax transparency standards and regulations. The OECD subsequently put Nauru through a fast-tracked compliance process and the country was given a "largely compliant" rating.
via Australian facilities. There is one satellite earth station, provided by Intelsat. Telephone numbers in Nauru The country code is +674, and the international call prefix is 00. There are seven other numbers in the system, with the format of numbers being +0067433724411. Telephone ranges In August 2011, Criden Appi, the Director of Telecommunications (Regulatory), said that Nauru advises "only 556xxxx, 557xxxx, 558xxxx are in use for mobiles and there are no landlines in service". In the ranges, X=0-9, and Y=0-9. Mobile telephone number ranges Fixed line area codes Special Numbers Radio and television As of 1998, there was one FM station and no shortwave or AM stations. The FM station, Radio Nauru FM 105, is owned by the government. In 1997, there were 7,000 radios. As of that year, there were 500 television sets. There are two television stations. One station is government-owned and mainly rebroadcasts CNN, while the other is a private sports network. Internet The country's ccTLD is .nr. Internet service in the country is provided by CenPacNet. Domains must be paid for, and can be ordered from CenPacNet. The original configuration of the .nr TLD was performed by Shaun Moran of Australian ComTech Communications in 1998 as part of the first Internet connectivity on the island.
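The advertised mobile ranges can be sanity-checked with a short script. This is an illustrative sketch only, not an official validator: the assumption that numbers are written in international form with the +674 country code followed by the seven-digit 556xxxx/557xxxx/558xxxx ranges is mine.

```python
import re

# Hypothetical check based on the 2011 advisory quoted above: Nauru's country
# code is +674, and mobile numbers fall in the 556xxxx, 557xxxx and 558xxxx
# ranges (seven digits in total). Landline ranges are not modelled.
NAURU_MOBILE = re.compile(r"^\+674(55[6-8])\d{4}$")

def is_nauru_mobile(number: str) -> bool:
    """Return True if the string looks like a Nauru mobile number in +674 form."""
    return NAURU_MOBILE.match(number) is not None

print(is_nauru_mobile("+6745561234"))  # True: 556xxxx range
print(is_nauru_mobile("+6745591234"))  # False: 559 is not an advertised range
```

A pattern like this only encodes the ranges quoted in the 2011 statement; any later reallocation of number blocks would require updating the regular expression.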
to the airport. There are five aeroplanes in service. Rail Rail transport is used for moving phosphate from the island's interior to the cantilever jetties on the island's western coast, in Aiwo District. For this purpose, a 3,900 m long, 0.61 m narrow gauge railway was built
Railway tracks eventually extended inland. Hauling guano by muscle-power in the fierce tropical heat, combined with general disgruntlement with conditions on the island, eventually provoked a rebellion in 1889, in which five supervisors died. A U.S. warship returned 18 of the workers to Baltimore for three separate trials on murder charges. A black fraternal society, the Order of Galilean Fishermen, raised money to defend the miners in federal court, and the defense built its case on the contention that the men acted in self-defense or in the heat of passion, and that the United States did not have jurisdiction over the island. E. J. Waring, the first black lawyer to pass the Maryland bar, was a part of the defense's legal team. The cases, including Jones v. United States, went to the U.S. Supreme Court in October 1890, which ruled the Guano Act constitutional, and three of the miners were scheduled for execution in the spring of 1891. A grass-roots petition driven by black churches around the country, also signed by white jurors from the three trials, reached President Benjamin Harrison, who commuted the sentences to imprisonment and mentioned the case in a State of the Union Address. Guano mining resumed on Navassa at a much reduced level. The Spanish–American War of 1898 forced the Phosphate Company to evacuate the island and file for bankruptcy, and the new owners abandoned the island after 1901. 1901 to present In 1905, the U.S. Lighthouse Service identified Navassa Island as a good location for a new lighthouse. However, plans for the light moved slowly. With the opening of the Panama Canal in 1914, shipping between the American eastern seaboard and the Canal through the Windward Passage between Cuba and Haiti increased in the area of Navassa, which proved a hazard to navigation. The Lighthouse Service finally built Navassa Island Light, a tower on the island in 1917, above sea level. 
At the same time, a wireless telegraphy station was established on the island. A keeper and two assistants were assigned to live there until the Lighthouse Service installed an automatic beacon in 1929. After absorbing the Lighthouse Service in 1939, the U.S. Coast Guard serviced the light twice each year. The U.S. Navy set up an observation post for the duration of World War II. The island has been uninhabited since then. Fishermen, mainly from Haiti, fish the waters around Navassa. A scientific expedition from Harvard University studied the land and marine life of the island in 1930. After World War II, amateur radio operators occasionally visited to operate from the territory, which is accorded "entity" (country) status by the American Radio Relay League. The callsign prefix is KP1. From 1903 to 1917, Navassa was a dependency of the U.S. Guantanamo Bay Naval Base, and from 1917 to 1996, it was under United States Coast Guard administration. In 1996, the Coast Guard dismantled the light on Navassa, which ended its interest in the island. Consequently, the Department of the Interior assumed responsibility for the civil administration of the area, and placed the island under its Office of Insular Affairs. For statistical purposes, Navassa was grouped under the now-obsolete term United States Miscellaneous Caribbean Islands and is now grouped with other islands claimed by the U.S. under the Guano Islands Act as the United States Minor Outlying Islands. In 1997, an American salvager made a claim to Navassa to the Department of State based on the Guano Islands Act. On March 27, 1997, the Department of the Interior rejected the claim on the basis that the Guano Islands Act applies only to islands which, at the time of the claim, are not "appertaining to" the United States. The department's opinion said that Navassa is and remains a U.S. possession "appertaining to" the United States and is "unavailable to be claimed" under the Guano Islands Act. 
A 1998 scientific expedition led by the Center for Marine Conservation in Washington, D.C., described Navassa as "a unique preserve of Caribbean biodiversity." The island's land and offshore ecosystems have survived the 20th century virtually untouched. In September 1999, the United States Fish and Wildlife Service established the Navassa Island National Wildlife Refuge, which encompasses of land and a 12 nautical mile (22.2 km) radius of marine habitat around the island. Later that year, full administrative responsibility for Navassa was transferred from the Office of Insular Affairs to the U.S. Fish and Wildlife Service. Due to hazardous coastal conditions and for preservation of species habitat, the refuge is closed to the general public, and visitors need permission from the Fish and Wildlife Service to enter its territorial waters or land. Since it became a National Wildlife Refuge, amateur radio operators have repeatedly been denied entry. In October 2014, permission was granted for a two-week DX-pedition in February 2015. The operation made 138,409 contacts. Geography, topography and ecology Navassa Island is about in area. It is located west of Haiti's southwest peninsula, south of the U.S. naval base at Guantanamo Bay, Cuba, and about one-quarter of the way from mainland Haiti to Jamaica in the Jamaica Channel. Navassa reaches an elevation of at Dunning Hill south of the lighthouse, Navassa Island Light. This location is from the southwestern coast or east of Lulu Bay. The terrain of Navassa Island consists mostly of exposed coral and limestone, the island being ringed by vertical white cliffs high, but with enough grassland to support goat herds. The island is covered in a forest of four tree species: short-leaf fig (Ficus populnea var. brevifolia), pigeon plum (Coccoloba diversifolia), mastic (Sideroxylon foetidissimum), and poisonwood. The United States has claimed the island since 1857, based on the Guano Islands Act of 1856. 
Haiti's claim over Navassa goes back to the Treaty of Ryswick in 1697, which established the French possessions on mainland Hispaniola that were transferred from Spain, as well as other specifically named nearby islands. Its 1801 constitution claimed several nearby islands by name, among which Navassa was not enumerated, but also laid claim to "other adjacent islands", which Haiti maintains included Navassa. The U.S. claim to the island, first made in 1857, asserts that Navassa was not included among the unnamed "other adjacent islands" in the Haitian Constitution of 1801. Since the Haitian Constitution of 1874, Haiti has explicitly named "la Navase" as one of the territories it claims, and maintains that it has been claimed as part of Haiti continuously since 1801. Médéric Louis Élie Moreau de Saint-Méry, who was a member of the French Parliament best known for his publications on Saint-Domingue (now the Republic of Haiti), referred to la Navasse as the "small French island of Saint-Domingue" in 1798. History 1504 to 1901 In 1504, Christopher Columbus, stranded on Jamaica during his fourth voyage, sent some crew members by canoe to Hispaniola for help. They ran into the island on the way, but it had no water. They called it Navaza (from "nava-" meaning plain, or field), and it was avoided by mariners for the next 350 years. From 1801 to 1867, the successive constitutions of Haiti claimed national sovereignty over adjacent islands, both named and unnamed, although Navassa was not specifically enumerated until 1874. Navassa Island was also claimed for the United States on September 19, 1857, by Peter Duncan, an American sea captain, under the Guano Islands Act of 1856, for the rich guano deposits found on the island, and for not being within the lawful jurisdiction of any other government, nor occupied by another government's citizens. Haiti protested the annexation, but on July 7, 1858, U.S. 
President James Buchanan issued an Executive Order upholding the American claim, which also called for military action to enforce it. Navassa Island has since been maintained by the United States as an unincorporated territory (according to the Insular Cases). The United States Supreme Court on November 24, 1890, in Jones v. United States, 137 U.S. 202, 224 (1890), found that Navassa Island must be considered as appertaining to the United States, creating a legal history for the island under U.S. law unlike many other islands originally claimed under the Guano Islands Act. As listed in its 1987 constitution, Haiti maintains its claim to the island, which is considered part of the department of Grand'Anse. Guano mining and the Navassa Island Rebellion of 1889 Guano phosphate is a superior organic fertilizer that became a mainstay of American agriculture in the mid-19th century. In November 1857, Duncan transferred his discoverer's rights to his employer, an American guano trader in Jamaica, who sold them to the newly formed Navassa Phosphate Company of Baltimore. After an interruption for the American Civil War, the company built larger mining facilities on Navassa with barrack housing for 140 black contract laborers from Maryland, houses for white supervisors, a blacksmith shop, warehouses, and a church. Mining began in 1865. The workers dug out the guano by dynamite and pick-axe and hauled it in rail cars to the landing point at Lulu Bay, where it was put into sacks and lowered onto boats for transfer to the Company barque, the S.S. Romance. The living quarters at Lulu Bay were referred to as 'Lulu Town', as appears on old maps. 
by the Mahiṣapālavaṃśa or "buffalo-herder dynasty", established by a Yadav named Bhul Singh. The Shakya clan formed an independent oligarchic republican state known as the 'Śākya Gaṇarājya' during the late Vedic period (c. 1000 – c. 500 BCE) and the later so-called second urbanisation period (c. 600 – c. 200 BCE). Its capital was Kapilavastu, which may have been located in present-day Tilaurakot, Nepal. Gautama Buddha (c. 6th to 4th centuries BCE), whose teachings became the foundation of Buddhism, was the best-known Shakya. He was known in his lifetime as "Siddhartha Gautama" and "Shakyamuni" (Sage of the Shakyas). He was the son of Śuddhodana, the elected leader of the Śākya Gaṇarājya. Kirat dynasty The Kirat dynasty's rule in Nepal, after the Mahispal (Ahir) dynasty and before the Licchavi dynasty, is depicted in different manuscripts. The Gopal genealogy delineates the area between the Sun Koshi and Tama Koshi rivers as the Kiratis' native land and also gives a list of Kirati kings. The Mahisapala dynasty, established by the Abhiras, ruled the Kathmandu Valley. They took control of Nepal after replacing the Gopala dynasty. Three kings of the Mahisapala dynasty ruled the valley before they were overthrown by the Kirata dynasty. They were also known as Mahispalbanshi. By defeating Bhuwansingh, the last king of the Ahir dynasty, in battle, the Kirati king Yalung or Yalamber took control of the valley. From a Hindu mythological perspective, this event is believed to have taken place in the final phase of Dvapara Yuga or the initial phase of Kali Yuga, or around the 6th century BC. Descriptions of 32, 28 and 29 Kirati kings are found according to the Gopal genealogy, the language-genealogy and the Wright genealogy respectively. 
According to notices contained in the classics of the East and West, the Kiranti people have been living in their present whereabouts for the last 2,000 to 2,500 years, with an extensive dominion possibly reaching at one time to the delta of the Ganges. Licchavi dynasty The kings of the Lichhavi dynasty (originally from Vaishali in modern-day India) ruled what is now the Kathmandu valley in modern-day Nepal after the Kirats. It is mentioned in some genealogies and Puranas that the "Suryavansi Kshetriyas had established a new regime by defeating the Kirats". The Pashupati Purana mentions that "the masters of Vaishali established their own regime by confiding Kiratis with sweet words and defeating them in war". Similar contexts can be found in 'Himbatkhanda', which also mentions that "the masters of Vaishali had started ruling in Nepal by defeating Kirats". Different genealogies state different names of the last Kirati king. The Lichhavis established their rule in Nepal by defeating the last Kirati king, named 'Khigu' according to the Gopal genealogy, 'Galiz' according to the language-genealogy and 'Gasti' according to the Wright genealogy. Medieval history Thakuri dynasty The Thakuri dynasty were Rajputs. After Aramudi, who is mentioned in the Kashmirian chronicle, the Rajatarangini of Kalhana (1150 CE), many Thakuri kings ruled over parts of the country up to the middle of the 12th century CE. Raghava Deva is said to have founded a ruling dynasty in 879 CE, when the Lichhavi rule came to an end. To commemorate this important event, Raghava Deva started the 'Nepal Era', which began on 20 October, 879 CE. After Amshuvarma, who ruled from 605 CE onward, the Thakuris lost power and could regain it only in 869 CE. Gunakama Deva, who ruled from 949 to 994 CE, commissioned the construction of a big wooden shelter, built from the wood of a single tree, called Kasthamandapa. The name of the capital, 'Kathmandu', is derived from this. 
Gunakama Deva founded the town Kantipur (modern-day Kathmandu). The tradition of Indra Jatra started during his reign. Bhola Deva succeeded Gunakama Deva. The next ruler was Laxmikama Deva, who ruled from 1024 to 1040 CE. He built the Laksmi Vihara and introduced the tradition of worshiping the Kumari; young prepubescent girls believed to be manifestations of the divine female energy or devi. He was succeeded by his son, Vijayakama Deva, who introduced the worship of the Naga and Vasuki. Vijaykama Deva was the last ruler of this dynasty. After his death, the Thakuri clan of Nuwakot occupied the throne of Nepal. Bhaskara Deva, a Thakuri from Nuwakot, succeeded Vijayakama. He is said to have built Navabahal and Hemavarna Vihara. After Bhaskara Deva, four kings of this line ruled over the country. They were Bala Deva, Padma Deva, Nagarjuna Deva and Shankara Deva. Shankara Deva (1067–1080 CE) was the most illustrious ruler of this dynasty. He established the image of 'Shantesvara Mahadeva' and 'Manohara Bhagavati'. The custom of pasting the pictures of Nagas and Vasuki on the doors of houses on the day of Nagapanchami was introduced by him. During his rule, the Buddhists wreaked vengeance on the Hindu Brahmins (especially the followers of Shaivism) for the harm they had received earlier from the Shankaracharya. Shankara Deva tried to pacify the Brahmins harassed by the Buddhists. Bama Deva, a descendant of Amshuvarma, defeated Shankar Deva in 1080 CE. He suppressed the Nuwakot-Thankuris with the help of nobles and restored the old Solar Dynasty rule in Nepal for the second time. Harsha Deva, the successor of Bama Deva, was a weak ruler. There was no unity among the nobles, and they asserted themselves in their respective spheres of influence. Taking that opportunity, Nanya Deva, a Karnat dynasty king, attacked Nuwakot from Simraungarh, but the defending army successfully repelled the attack and won the battle. After Harsha Deva, Shivadeva the third ruled from 1099 to 1126 CE. 
He founded the town of Kirtipur and roofed the temple of Pashupatinath with gold. He introduced twenty-five paisa coins. After Sivadeva III, Mahendra Deva, Mana Deva, Narendra Deva II, Ananda Deva, Rudra Deva, Amrita Deva, Ratna Deva II, Somesvara Deva, Gunakama Deva II, Lakmikama Deva III and Vijayakama Deva II ruled Nepal in quick succession. Historians differ about the rule of several kings and their respective times. After the fall of the Thakuri dynasty, a new dynasty was founded by Arideva or Ari Malla, known as the 'Malla dynasty'. Malla dynasty Early Malla rule started with Ari Malla in the 12th century. Over the next two centuries, his kingdom expanded widely, into much of the Indian subcontinent and western Tibet, before disintegrating into small principalities, which later came to be known as the Baise Rajya. Jayasthiti Malla, with whom commences the later Malla dynasty of the Kathmandu valley, began to reign at the end of the 14th century. The Malla dynasty was the longest-ruling dynasty in Nepalese history, ruling from the 12th century to the 18th century (about 600 years). This era in the valley is eminent for various social and economic reforms, such as the 'Sanskritization' of the valley people and new methods of land measurement and allocation. In this era, new forms of art and architecture were introduced. The monuments in the Kathmandu valley listed as UNESCO World Heritage Sites were built during Malla rule. In the 14th century, before Kathmandu was divided into 3 princely states, Araniko was sent to China upon the request of Abhaya Malla to represent Nepalese skill in art and architecture, and he introduced the Pagoda style of architecture to China and subsequently the whole of Asia. Yaksha Malla, the grandson of Jayasthiti Malla, ruled the Kathmandu valley until almost the end of the 15th century. After his demise, the valley was divided into three independent kingdoms—Kathmandu, Bhaktapur, and Patan—in about 1484 CE. 
This division led the Malla rulers into internecine clashes and wars for territorial and commercial gains. Mutually debilitating wars gradually weakened them, which facilitated the conquest of the valley by Prithvi Narayan Shah of Gorkha. The last Malla rulers were Jaya Prakash Malla, Teja Narasingha Malla and Ranjit Malla of Kathmandu, Patan, and Bhaktapur respectively. Simroun dynasty The Simroun, Simroon, Karnat or Dev dynasty originated with the establishment of a kingdom in 1097 CE headquartered at present-day Simroungarh in Bara district. The kingdom controlled the areas today known as Tirhoot or Mithila in Nepal and Bihar of India. The rulers of Simroungarh were as follows:
Nanya Dev, ruled 1097–1147 CE
Ganga Dev, ruled 1147–1187 CE
Narsingh Dev, ruled 1187–1227 CE
Ramsingh Dev, ruled 1227–1285 CE
Shaktisingh Dev, ruled 1285–1295 CE
Harisingh Dev, ruled 1295–1324 CE
In 1324 CE, Ghiyasuddin Tughlaq attacked Simroungarh and demolished the fort. The remains are still scattered across the Simroungarh region. The king, Harisingh Dev, fled northwards, where his son, Jagatsingh Dev, was married to the widowed princess of Bhaktapur, Nayak Devi. Shah dynasty, Unification of Nepal Prithvi Narayan Shah (c. 1768–1775) was the ninth-generation descendant of Dravya Shah (1559–1570), the founder of the ruling house of Gorkha. Prithvi Narayan Shah succeeded his father Nara Bhupal Shah to the throne of Gorkha in 1743 CE. King Prithvi Narayan Shah was quite aware of the political situation of the valley kingdoms as well as of the Baise and Chaubise principalities. He foresaw the need for unifying the small principalities as an urgent condition for survival in the future and set himself to the task accordingly. His assessment of the situation among the hill principalities was correct, and the principalities were subjugated fairly easily. King Prithvi Narayan Shah's victory march began with the conquest of Nuwakot, which lies between Kathmandu and Gorkha, in 1744. 
After Nuwakot, he occupied strategic points in the hills surrounding the Kathmandu valley. The valley's communications with the outside world were thus cut off. The occupation of the Kuti Pass in about 1756 stopped the valley's trade with Tibet. Finally, Prithvi Narayan Shah entered the valley. After the victory in Kirtipur, King Jaya Prakash Malla of Kathmandu sought help from the British, and the then East India Company sent a contingent of soldiers under Captain Kinloch in 1767. The British force was defeated in Sindhuli by the Gorkhali army. This defeat of the British completely shattered the hopes of King Jaya Prakash Malla. On 25 September 1768, as the people of Kathmandu were celebrating the festival of Indra Jatra, the Gorkhali army marched into the city. Prithvi Narayan Shah sat on a throne placed in the palace courtyard for the king of Kathmandu, proclaiming himself the king. Jaya Prakash Malla somehow managed to escape and took asylum in Patan. When Patan was captured a few weeks later, both Jaya Prakash Malla and Tej Narsingh Malla, the king of Patan, took refuge in Bhaktapur, which was captured on the night of 25 November 1769. The Kathmandu valley was thus conquered by King Prithvi Narayan Shah, who proclaimed himself King with Kathmandu as the royal capital of the Kingdom of Nepal. King Prithvi Narayan Shah was successful in bringing together diverse religio-ethnic groups under one rule. He was a true nationalist in his outlook and was in favor of adopting a closed-door policy with regard to the British. Not only did his social and economic views guide the country's socio-economic course for a long time, but his use of the imagery of 'a yam between two boulders' for Nepal's geopolitical position also formed the principal guideline of the country's foreign policy for future centuries. 
Modern history Kingdom of Nepal After decades of rivalry between the medieval kingdoms, modern Nepal was unified in the latter half of the 18th century, when Prithvi Narayan Shah, the ruler of the small principality of Gorkha, formed a unified country from a number of independent hill states. After the death of Prithvi Narayan Shah, the Shah dynasty began to expand their kingdom into much of the Indian subcontinent. Between 1788 and 1791, during the Sino-Nepalese War, Nepal invaded Tibet and looted Tashilhunpo Monastery in Shigatse. Alarmed, the Qianlong Emperor of the Chinese Qing Dynasty appointed Fuk'anggan commander-in-chief of the Tibetan campaign; Fuk'anggan signed a treaty to protect his troops, thus settling for a draw. After 1800, the heirs of Prithvi Narayan Shah proved unable to maintain firm political control over Nepal. A period of internal turmoil followed. The rivalry between Nepal and the British East India Company over the princely states bordering Nepal and British India eventually led to the Anglo-Nepalese War (1814–16), in which Nepal suffered substantial losses due to a lack of guns and ammunition against British-Indian forces equipped with advanced weapons. The Treaty of Sugauli was signed in 1816, ceding large parts of the Nepalese-controlled territories to the British. In 1860, some parts of the western Terai, known as Naya Muluk (new country), were restored to Nepal. Before the rise of the Rana dynasty, the four noble families largely involved in the active politics of the kingdom were the Shah rulers, the Thapas, the Basnyats, and the Pandes. From the beginning to the middle of the 19th century, the Thapas and Pandes dominated Nepalese Darbar politics, alternately contesting for central power. Rana rule Jung Bahadur Rana was the first ruler from this dynasty. Rana rulers were titled "Shree Teen" and "Maharaja", whereas Shah kings were "Shree Panch" and "Maharajadhiraja". 
Jung Bahadur codified laws and modernized the state's bureaucracy. In the coup d'état of 1885, the nephews of Jung Bahadur and Ranodip Singh murdered Ranodip Singh and the sons of Jung Bahadur, adopted the name Jung Bahadur, and took control of Nepal. Nine Rana rulers held the hereditary office of Prime Minister; all were styled (self-proclaimed) Maharaja of Lamjung and Kaski. The Rana regime, a tightly centralized autocracy, pursued a policy of isolating Nepal from external influences. This policy helped Nepal maintain its independence during the British colonial era, but it also impeded the country's economic development and modernisation. The Ranas were staunchly pro-British and assisted the British during the Indian Rebellion of 1857 and later in both World Wars. At the same time, despite Chinese claims, the British supported Nepalese independence at the beginning of the twentieth century. In December 1923, Britain and Nepal formally signed a "treaty of perpetual peace and friendship" superseding the Sugauli Treaty of 1816 and upgrading the British resident in Kathmandu to an envoy. Slavery was abolished in Nepal in 1924 under the premiership of Chandra Shamsher Jang Bahadur Rana. Following the German invasion of Poland, the Kingdom of Nepal declared war on Germany on September 4, 1939. Once Japan entered the conflict, sixteen battalions of the Nepali Army fought on the Burmese front. In addition to military support, Nepal contributed guns and equipment as well as hundreds of thousands of pounds of tea, sugar, and raw materials such as timber to the Allied war effort. Revolution of 1951 The revolution of 1951 began when dissatisfaction with the family rule of the Ranas emerged among the few educated people, who had studied in various South Asian schools and colleges, and also from within the Ranas, many of whom were marginalized within the ruling Rana hierarchy.
Many of these Nepalese in exile had actively taken part in the Indian independence struggle and wanted to liberate Nepal as well from autocratic Rana rule. Political parties such as the Praja Parishad and the Nepali Congress had already been formed in exile by leaders such as B. P. Koirala, Ganesh Man Singh, Subarna Sumsher Rana, Krishna Prasad Bhattarai, Girija Prasad Koirala, and many other patriotic-minded Nepalis, who urged a military and popular political movement in Nepal to overthrow the autocratic Rana regime. The Nepali Congress also formed a military wing, the Nepali Congress Liberation Army. Among the prominent martyrs executed by the Ranas for the cause were Dharma Bhakta Mathema, Shukraraj Shastri, Gangalal Shrestha, and Dasharath Chand, all members of the Praja Parishad. This turmoil culminated in King Tribhuvan, a direct descendant of Prithvi Narayan Shah, fleeing from his "palace prison" in 1950 to India, touching off an armed revolt against the Rana administration. This eventually ended in the return of the Shah family to power and the appointment of a non-Rana as prime minister, following a tripartite agreement called the 'Delhi Compromise'. A period of quasi-constitutional rule followed, during which the monarch, assisted by the leaders of fledgling political parties, governed the country. During the 1950s, efforts were made to frame a constitution for Nepal that would establish a representative form of government based on a British model. A 10-member cabinet under Prime Minister Mohan Shumsher, with 5 members of the Rana family and 5 of the Nepali Congress, was formed. This government drafted a constitution called the 'Interim Government Act', the first constitution of Nepal. But the government failed to work in consensus, as the Ranas and the Congressmen were never on good terms.
So, on 16 November 1951, the king formed a new government of 14 ministers under Matrika Prasad Koirala, which was later dissolved. Panchayat system The first democratic elections were held in 1959, and B. P. Koirala was elected prime minister. But
the Kali Gandaki Gorge where Thakali culture shows influences in both directions. Permanent villages in the mountain region stand as high as , with summer encampments even higher. Bhotiyas graze yaks and grow cold-tolerant crops such as potatoes, barley, buckwheat and millet. They traditionally traded across the mountains, e.g., Tibetan salt for rice from the lowlands of Nepal and India. Since trade was restricted in the 1950s, they have found work as high-altitude porters, guides and cooks, and in other support roles for tourism and alpinism. Hilly The Hilly Region is a mountainous region that does not generally receive snow. It is situated to the south of the Himal Region (the snowy mountain region). This region begins at the Lower Himalayan Range, where a fault system called the Main Boundary Thrust creates an escarpment high, rising to a crest between . It covers 68% of the total area of Nepal. These steep southern slopes are nearly uninhabited, thus forming an effective buffer between the languages and cultures of the Terai and the Hilly Region. Paharis mainly populate river and stream bottoms that enable rice cultivation and are warm enough for winter/spring crops of wheat and potato. The increasingly urbanized Kathmandu and Pokhara valleys fall within the Hilly Region. Newars are an indigenous ethnic group with their own Tibeto-Burman language. The Newars were originally indigenous to the Kathmandu valley but have spread into Pokhara and other towns alongside urbanized Paharis. Other indigenous Janajati ethnic groups—natively speaking highly localized Tibeto-Burman languages and dialects—populate hillsides up to about . This group includes the Magar and Kham Magar west of Pokhara, the Gurung south of the Annapurnas, the Tamang around the periphery of the Kathmandu Valley, and the Rai, Koinch Sunuwar and Limbu further east. Temperate and subtropical fruits are grown as cash crops. Marijuana was grown and processed into charas (hashish) until international pressure persuaded the government to outlaw it in 1976.
There is increasing reliance on animal husbandry with elevation, using land above for summer grazing and moving herds to lower elevations in winter. Grain production has not kept pace with population growth at elevations above , where colder temperatures inhibit double cropping. Food deficits drive emigration out of the Pahad in search of employment. The Hilly Region ends where ridges begin rising substantially out of the temperate climate zone into the subalpine zone above . Terai The Terai is a lowland region containing some hill ranges; it covers 17% of the total area of Nepal. The Terai (also spelt Tarai) region begins at the Indian border and includes the southernmost part of the flat, intensively farmed Gangetic Plain, called the Outer Terai. By the 19th century, timber and other resources were being exported to India. Industrialization based on agricultural products such as jute began in the 1930s, and infrastructure such as roadways, railways and electricity was extended across the border before it reached Nepal's Pahad region. The Outer Terai is culturally more similar to adjacent parts of India's Bihar and Uttar Pradesh than to the Pahad of Nepal. Nepali is taught in schools and often spoken in government offices; however, the local population mostly uses the Maithili, Bhojpuri and Tharu languages. The Outer Terai ends at the base of the first range of foothills, called the Siwaliks or Churia. This range has a densely forested skirt of coarse alluvium called the Bhabhar. Below the Bhabhar, finer, less permeable sediments force groundwater to the surface in a zone of springs and marshes. In Persian, terai refers to wet or marshy ground. Before the use of DDT this zone was dangerously malarial. Nepal's rulers used it as a defensive frontier called the char kose jhadi (four kos forest, one kos equaling about three kilometers or two miles).
Above the Bhabhar belt, the Siwaliks rise to about , with peaks as high as ; they are steeper on their southern flanks because of a fault system known as the Main Frontal Thrust. This range is composed of poorly consolidated, coarse sediments that do not retain water or support soil development, so there is virtually no agricultural potential and the population is sparse. In several places beyond the Siwaliks, there are dūn valleys called the Inner Terai. These valleys have productive soil but were dangerously malarial except to the indigenous Tharu people, who had genetic resistance. In the mid-1950s DDT came into use to suppress mosquitoes, and the way was open to settlement from the land-poor hills, to the detriment of the Tharu. The Terai ends and the Pahad begins at a higher range of foothills called the Lower Himalayan Range. Climate Altitudinal belts Nepal's latitude is about the same as that of the United States state of Florida; however, with elevations ranging from less than to over and precipitation from to over , the country has eight climate zones, from tropical to perpetual snow. The tropical zone below experiences frost less than once per decade. It can be subdivided into lower tropical (below 300 meters or 1,000 ft, with 18% of the nation's land area) and upper tropical (18% of land area) zones. The best mangoes, as well as papaya and banana, are largely confined to the lower zone. Other fruit such as litchi, jackfruit, citrus and mangoes of lower quality grow in the upper tropical zone as well. Winter crops include grains and vegetables typically grown in temperate climates. The Outer Terai is virtually all in the lower tropical zone. Inner Terai valleys span both tropical zones. The Siwalik Hills are mostly upper tropical. Tropical climate zones extend far up river valleys across the Middle Hills and even into the Mountain regions. The subtropical climate zone from occupies 22% of Nepal's land area and is the most prevalent climate of the Middle Hills above the river valleys.
It experiences frost up to 53 days per year; however, this varies greatly with elevation, proximity to high mountains, and terrain that either drains or ponds cold air. Crops include rice, maize, millet, wheat, potato, stone fruits and citrus. The great majority of Nepal's population occupies the tropical and subtropical climate zones. In the Middle Hills, "upper-caste" Hindus are concentrated in tropical valleys well suited to rice cultivation, while Janajati ethnic groups mostly live above in the subtropical zone and grow other grains more than rice. The temperate climate zone from occupies 12% of Nepal's land area and has up to 153 annual days of frost. It is encountered in higher parts of the Middle Hills and throughout much of the Mountain region. Crops include cold-tolerant rice, maize, wheat, barley, potato, apple, walnut, peach, various cole crops, amaranthus and buckwheat. The subalpine zone from occupies 9% of Nepal's land area, mainly in the Mountain and Himalayan regions. It has permanent settlements in the Himalaya, but further south it is only seasonally occupied, as pasture for sheep, goats, yaks and hybrids in the warmer months. There are up to 229 annual days of frost here. Crops include barley, potato, cabbage, cauliflower, amaranthus, buckwheat and apple. Medicinal plants are also gathered. The alpine zone from occupies 8% of the country's land area. There are a few permanent settlements above 4,000 meters. There is virtually no plant cultivation, although medicinal herbs are gathered. Sheep, goats, yaks and hybrids are pastured in the warmer months. Above 5,000 meters the climate becomes nival and there is no human habitation or even seasonal use. Arid and semi-arid lands in the rainshadow of the high ranges have a Transhimalayan climate. Population density is very low. Cultivation and husbandry conform to subalpine and alpine patterns but depend on snowmelt and streams for irrigation.
Precipitation generally decreases from east to west with increasing distance from the Bay of Bengal, the source of the summer monsoon. Eastern Nepal gets about annually, the Kathmandu area about , and western Nepal about . This pattern is modified by adiabatic effects: rising air masses cool and drop their moisture on windward slopes, then warm as they descend, so relative humidity drops. Annual precipitation reaches on windward slopes in the Annapurna Himalaya beyond a relatively low stretch of the Lower Himalayan Range. In rainshadows beyond the high mountains, annual precipitation drops as low as . Seasons The year is divided into a wet season from June to September—as summer warmth over Inner Asia creates a low-pressure zone that draws in moist air from the Indian Ocean—and a dry season from October to June, as cold temperatures in the vast interior create a high-pressure zone causing dry air to flow outward. April and May are months of intense water stress, when the cumulative effects of the long dry season are exacerbated by temperatures rising over in the tropical climate belt. Seasonal drought further intensifies in the Siwalik hills, which consist of poorly consolidated, coarse, permeable sediments that do not retain water, so hillsides are often covered with drought-tolerant scrub forest. Indeed, much of Nepal's native vegetation is adapted to withstand drought, though less so at higher elevations, where cooler temperatures mean less water stress. The summer monsoon may be preceded by a buildup of thunderstorm activity that provides water for rice seedbeds. Sustained rain arrives on average in mid-June, though this can vary by up to a month. A significant failure of the monsoon rains historically meant drought and famine, while above-normal rains still cause flooding and landslides, with losses of human lives, farmland and buildings.
The monsoon also complicates transportation, with roads and trails washing out, unpaved roads and airstrips becoming unusable, and cloud cover reducing safety margins for aviation. Rains diminish in September and generally end by mid-October, ushering in generally cool, clear, and dry weather, as well as the most relaxed and jovial period in Nepal. By this time, the harvest is completed and people are in a festive mood. The two largest and most important Hindu festivals—Dashain and Tihar (Dipawali)—arrive during this period, about one month apart. The post-monsoon season lasts until about December. After the post-monsoon comes the winter monsoon, a strong northeasterly flow marked by occasional short rainfalls in the lowlands and plains and snowfalls in the high-altitude areas. In this season the Himalayas function as a barrier to cold air masses from Inner Asia, so southern Nepal and northern India have warmer winters than would otherwise be the case. April and May are dry and hot, especially below , where afternoon temperatures may exceed . Environment The dramatic changes in elevation from the Indian border to the high Himalaya result in a variety of biomes, from tropical savannas along the Indian border, to subtropical broadleaf and coniferous forests in the hills, to temperate broadleaf and coniferous forests on the slopes of the Himalaya, to montane grasslands and shrublands, and finally rock and ice at the highest elevations. The savannas along the Indian border correspond to the Terai-Duar savanna and grasslands ecoregion. Subtropical forests dominate the lower elevations of the Hill region. They form a mosaic running east–west across Nepal, with Himalayan subtropical broadleaf forests between and Himalayan subtropical pine forests between . At higher elevations, to , are found temperate broadleaf forests: eastern Himalayan broadleaf forests to the east of the Gandaki River and western Himalayan broadleaf forests to the west.
The native forests of the Mountain region change from east to west as precipitation decreases. They can be broadly classified by their relation to the Gandaki River. From are the eastern and western Himalayan subalpine conifer forests. To are the eastern and western Himalayan alpine shrub and meadows. Environmental issues Natural hazards: earthquakes; severe thunderstorms (tornadoes are rare); flooding and flash flooding; landslides; drought and famine, depending on the timing, intensity, and duration of the summer monsoons. Environment - current issues: deforestation (overuse of wood for fuel and lack of alternatives); contaminated water (with human and animal wastes, agricultural runoff, and industrial effluents); wildlife conservation; vehicular emissions. Environment - international agreements: party to Biodiversity, Climate Change, Climate Change-Kyoto Protocol, Desertification, Endangered Species, Hazardous Wastes, Law of the Sea, Ozone Layer Protection, Tropical Timber 83, Tropical Timber 94, Wetlands; signed, but not ratified: Marine Life Conservation. Existing and proposed dams, barrages and canals provide flood control, irrigation and hydroelectric generation. River systems Nepal has three categories of rivers.
The largest systems—from east to west the Koshi, Gandaki/Narayani, Karnali/Goghra and Mahakali—originate in multiple tributaries rising in or beyond the high Himalaya that maintain substantial flows from snowmelt through the hot, drought-stricken spring before the summer monsoon. These tributaries cross the highest mountains in deep gorges, flow south through the Middle Hills, then join in a candelabra-like configuration before crossing the Lower Himalayan Range and emerging onto the plains, where they have deposited megafans exceeding in area. The Koshi is also called the Sapta Koshi for its seven Himalayan tributaries in eastern Nepal: the Indrawati, Sun Koshi, Tama Koshi, Dudh Koshi, Likhu, Arun, and Tamor. The Arun rises in Tibet some beyond Nepal's northern border. The Bhote Koshi, a tributary of the Sun Koshi, also rises in Tibet and is followed by the Arniko Highway connecting Kathmandu and Lhasa. The Gandaki/Narayani has seven Himalayan tributaries in the center of the country: the Daraundi, Seti Gandaki, Madi, Kali, Marsyandi, Budhi, and Trisuli, together called the Sapta Gandaki.
The Kali Gandaki rises on the edge of the Tibetan Plateau and flows through the semi-independent Kingdom of Mustang, then between the 8,000-meter Dhaulagiri and Annapurna ranges in the world's deepest valley. The Trisuli rises north of the international border, inside Tibet. After the seven upper tributaries join, the river becomes the Narayani inside Nepal and is joined by the East Rapti from the Chitwan Valley. Crossing into India, its name changes to Gandak. The Karnali drains western Nepal, with the Bheri and Seti as major tributaries. The upper Bheri drains Dolpo, a remote valley beyond the Dhaulagiri Himalaya with traditional Tibetan cultural affinities. The upper Karnali rises inside Tibet near the sacred Lake Manasarovar and Mount Kailash. The area around these features is the hydrographic nexus of South Asia, since it holds the sources of the Indus and its major tributary the Sutlej, the Karnali—a Ganges tributary—and the Yarlung Tsangpo/Brahmaputra. It is the centre of the universe according to traditional cosmography. The Mahakali or Kali, along the Nepal–India border on the west, joins the Karnali in India, where the river is known as the Goghra or Ghaghara. Second-category rivers rise in the Middle Hills and the Lower Himalayan Range: from east to west, the Mechi, Kankai and Kamala south of the Kosi; the Bagmati, which drains the Kathmandu Valley, between the Kosi and Gandaki systems; then the West Rapti and the Babai between the Gandaki and Karnali systems. Without glacial sources, annual flow regimes in these rivers are more variable, although limited flow persists through the dry season. Third-category rivers rise in the outermost Siwalik foothills and are mostly seasonal. None of these river systems supports significant commercial navigation. Instead, deep gorges create obstacles to establishing the transport and communication networks needed to knit together a fragmented economy. Foot trails are still the primary transportation routes in many hill districts.
River management Rivers in all three categories are capable of causing serious floods. The Koshi, in the first category, caused a major flood in August 2008 in Bihar state, India, after breaking through a poorly maintained embankment just inside Nepal. The West Rapti, in the second category, is called "Gorakhpur's Sorrow" for its history of urban flooding. Third-category Terai rivers are associated with flash floods. Since uplift and erosion are more or less in equilibrium in the Himalaya, at least where the climate is humid, rapid uplift must be balanced by millions of tonnes of sediment washing down from the mountains each year and, on the plains, settling out of suspension on vast alluvial fans over which rivers meander and change course at least every few decades, leading some experts to question whether manmade embankments can contain the problem of flooding. Traditional Mithila culture along the lower Koshi in Nepal and Bihar celebrated the river as the giver of life for its fertile alluvial soil, yet also the taker of life through its catastrophic floods. Large reservoirs in the Middle Hills may be able to capture peak flows and mitigate downstream flooding, to store surplus monsoon flows for dry-season irrigation, and to generate electricity. Water for irrigation is an especially compelling motive, because the Indian Terai is suspected to have entered a food bubble in which dry-season crops depend on water from tube wells that, in the aggregate, are unsustainably "mining" groundwater. Depletion of aquifers, without upstream dams as a sustainable alternative water source, could precipitate a Malthusian catastrophe in India's food-insecure states of Uttar Pradesh and Bihar, with over 300 million people between them. With India already experiencing a Naxalite–Maoist insurgency in Bihar, Jharkhand and Andhra Pradesh, Nepalese reluctance to agree to water projects could even seem an existential threat to India.
As Nepal builds barrages to divert more water for irrigation during the dry season preceding the summer monsoon, there is less for downstream users in Bangladesh and in India's Bihar and Uttar Pradesh states. The best solution could be building large upstream reservoirs to capture and store surplus flows during the summer monsoon, which would also provide flood-control benefits to Bangladesh and India. Water-sharing agreements could then allocate a portion of the stored water to be released into India during the following dry season. Nevertheless, building dams in Nepal is controversial for several reasons. First, the region is seismically active. Dam failures caused by earthquakes could cause tremendous death and destruction downstream, particularly on the densely populated Gangetic Plain. Second, global warming has led to the formation of glacial lakes dammed by unstable moraines. Sudden failures of these moraines can cause floods with cascading failures of manmade structures downstream. Third, sedimentation rates in the Himalaya are extremely high, leading to rapid loss of storage capacity as sediments accumulate behind dams. Fourth, there are complicated questions of cross-border equity in how India and Nepal would share costs and benefits, which have proven difficult to resolve in the context of frequent acrimony between the two countries (Malhotra, op. cit., http://www.ipcs.org/pdf_file/issue/SR95.pdf).
Area
Total: 
Land: 
Water: 
Coastline 0 km (landlocked)
Elevation extremes
Lowest point: Kechana Kawal, Jhapa District, 59 m
Highest point: Sagarmatha (Mount Everest), 8,848 m
Resources and land use Natural resources: quartz, water, timber, hydropower, scenic beauty, small deposits of lignite, copper, cobalt, iron ore.
Land use
Arable land: 16.0%
Permanent crops: 0.8%
Other: 83.2% (2001)
Irrigated land: 11,680 km² (2003), nearly 50% of arable land
Total renewable water resources: 210.2 km3 (2011)
Land cover ICIMOD's first and most complete national land cover database of Nepal, prepared using public-domain Landsat TM data of 2010, shows that forest is the dominant form of land cover in Nepal, covering 57,538 km2, or 39.09% of the total geographical area of the country. Most of this forest cover is broadleaved closed and open forest, which covers 21,200 km2 or 14.4% of the geographical area. Needleleaved open forest is the least common forest type, covering 8,267 km2 (5.62%). The agricultural area is also significant, extending over 43,910 km2 (29.83%). As would be expected, the high mountain area is largely covered by snow, glaciers and barren land. The Hill region constitutes the largest portion of Nepal, covering 29.5% of the geographical area, and has large areas of cultivated or managed land (19,783 km2), natural and semi-natural vegetation (22,621 km2) and artificial surfaces (200 km2). The Tarai region has more cultivated or managed land (14,104 km2) and comparatively less natural and semi-natural vegetation (4,280 km2). The Tarai has only 267 km2 of natural water bodies. The High Mountain region has 12,062 km2 of natural water bodies and snow/glaciers and 13,105 km2 of barren areas. Forests 25.4% of Nepal's land area, or about , is covered with forest according to FAO figures from 2005.
FAO estimates that around 9.6% of Nepal's forest cover consists of primary forest, which is relatively intact. About 12.1% of Nepal's forest is classified as protected, while about 21.4% is conserved, according to FAO. About 5.1% of Nepal's forests are classified as production forest. Between 2000 and 2005, Nepal lost about of forest. Nepal's 2000–2005 total deforestation rate was about 1.4% per year, meaning it lost an average of of forest annually. Nepal's total deforestation rate from 1990 to 2000 was , or 2.1% per year. The 2000–2005 true deforestation rate in Nepal, defined as the loss of primary forest, is −0.4%, or per year. Forest cover is not changing in the plains of Nepal, but forest is fragmenting on the "Roof of the World". According to ICIMOD figures from 2010, forest is the dominant form of land cover in Nepal, covering 57,538 km2, or 39.09% of the total geographical area of the country. Most of this forest cover is broadleaved closed and open forest, which covers 21,200 km2 or 14.4% of the geographical area. Needleleaved open forest is the least common forest type, covering 8,267 km2 (5.62%). At the national level, 64.8% of the forest area consists of core forests larger than 500 ha, and 23.8% of forests belong to the patch and edge categories. Patch forest constituted 748 km2 at the national level, of which 494 km2 is in the hill regions. The Middle Mountains, Siwaliks and Terai regions have more than 70% of their forest area under the core forest category of over 500 ha. Edge forests constituted around 30% of the forest area of the High Mountain and Hill regions. The Forest Resource Assessment (FRA), which was conducted between 2010
15-24 years: 21.86% (male 3,176,158/female 3,169,721)
25-54 years: 35.99% (male 4,707,264/female 5,740,985)
55-64 years: 6.22% (male 877,288/female 927,202)
65 years and over: 5.02% (male 723,523/female 732,620) (2016 est.)
Median age
Total: 23.6 years
Male: 22.4 years
Female: 24.8 years (2016 est.)
Population growth rate 1.24% (2016 est.)
Birth rate 19.9 births/1,000 population (2016 est.)
Death rate 5.7 deaths/1,000 population (2016 est.)
Net migration rate 1.9 migrants/1,000 population (2016 est.)
Total fertility rate 2.18 children born/woman (2016 est.)
Urbanization
Urban population: 18.6% of total population (2015)
Rate of urbanization: 3.18% annual rate of change (2010-15 est.)
Sex ratio
At birth: 1.04 males/female
0-14 years: 1.07 males/female
15-24 years: 1 males/female
25-54 years: 0.82 males/female
55-64 years: 0.95 males/female
65 years and over: 0.86 males/female
Total population: 0.99 males/female (2016 est.)
Languages Nepal's diverse linguistic heritage evolved from three major language groups: Indo-Aryan, Tibeto-Burman languages, and various indigenous language isolates. According to the 2001 national census, 92 different living languages are spoken in Nepal (a 93rd category was "unspecified"). Based upon the 2011 census, the major languages spoken in Nepal (by percentage of mother-tongue speakers) include Nepali. Derived from Khas bhasa, Nepali is an Indo-Aryan language written in Devanagari script. It was the language of the house of Gorkha in the late 18th century and became the official, national language that serves as the lingua franca among Nepalese of different ethnolinguistic groups. Maithili—along with the regional languages Awadhi and Bhojpuri—is among the mother tongues spoken in the southern Terai. There has been a surge in the number and percentage of people who understand English. The majority of urban schools, and a significant number of rural schools, are English-medium.
Higher education in technical, medical, scientific and engineering fields is entirely in English. Nepal Bhasa, the mother tongue of the Newars, is widely used and spoken in and around the Kathmandu Valley and in major Newar trade towns across Nepal. Other languages, particularly in the Inner Terai, hill and mountain regions, are remnants of the country's pre-unification history of dozens of political entities isolated by mountains and gorges. These languages are typically limited to an area spanning about one day's walk; beyond that distance, dialects and languages lose mutual intelligibility. However, there are some major languages spoken by indigenous peoples in the region: Magar and Gurung in the west-central hills, Tamang in the east-centre and Limbu in the east. In the high Himalayas various Tibetan languages are spoken, including Bhotia. Since Nepal's unification, various indigenous languages have come under threat of extinction as the government of Nepal has marginalized their use through strict policies designed to promote Nepali as the official language. Indigenous languages which have gone extinct or are critically threatened include Byangsi, Chonkha, and Longaba. Since democracy was restored in 1990, however, the government has worked to reverse the marginalization of these languages. Tribhuvan University began surveying and recording threatened languages in 2010, and the government intends to use this information to include more languages in the next Nepalese census. Religion As of the 2011 census, 81.3% of the Nepalese population was Hindu, 9.0% Buddhist, 4.4% Muslim, 3.0% Kiratist/Yumaist, 1.42% Christian, and 0.9% followed other or no religion. Nepal defines itself as a secular nation according to the Constitution of Nepal. It is common for many Hindus in the country to worship Buddhist deities alongside Hindu traditions. The notion of religion in Nepal is more fluid than in other countries, particularly Western countries.
The Nepali people build their social networks through their religious celebrations, which are central to communities throughout the country. There is a general ideal held by the Nepalese people that there is an omnipotent, transcendental "moral order" sacred to Hinduism. This ideal exists alongside the constant presence of chaos and disorder in the material world. In the northwestern region of the country, this all-encompassing state of disorder in the world is synonymous with human affliction, which religious shamans can alleviate. Shamans create a world of mythic time and space to restore order and balance to the world and to cure the sufferers. The Kathmandu Valley is home to the Newars, a major ethnic group in Nepal. The city of Bhaktapur, located inside the Kathmandu Valley, was once an independent Hindu kingdom. Individual homes typically have at least one shrine devoted to personal deities, with an altar displaying flowers, fruit, and oil among other offerings to the gods. The perimeter of the Kathmandu Valley is lined with shrines devoted to Hindu goddesses, whose purpose is to protect the city from chaotic events. At least one shrine can be found on the vast majority of streets in Kathmandu. The people of Nepal do not feel the need to segregate or compete based upon religion, so Hindu and Buddhist shrines often coexist in the same areas. The areas outside the city are perceived as always possessing some form of wild or disordered nature, so the Nepalese people inside the city limits regularly worship the Hindu gods through public ceremonies. The Hindu god Vishnu symbolizes moral order in Newar society. The natural human shortcomings in maintaining the godly moral order are represented by the Hindu god Shiva, who is destructive, acts in greed, and threatens the moral order. In ancient myths, Vishnu must step in to contain Shiva and restore order.
In recent times, there has been a rise in political violence, specifically Maoist violence. This violence, along with widespread poverty, has created hardship for the people of Nepal, who during these struggles find stability and peace in religion. Nepal's constitution continues long-standing legal provisions prohibiting discrimination against other religions (but also prohibiting proselytization). The king was deified as the earthly manifestation of the Hindu god Vishnu. On May 19, 2006, the government faced a constitutional crisis: the House of Representatives, which had just been reinstated after an earlier dissolution, declared Nepal a "secular state". However, the 2001 census had identified 80.6% of the population as Hindu and 10.7% as Buddhist (although many people labelled Hindu or Buddhist practise a syncretic blend of Hindu, Buddhist, and animist traditions); 4.2% of the population was Muslim, 3.6% followed the indigenous Kirat Mundhum religion, and 0.45% practised Christianity. Buddhist and Hindu shrines and festivals are respected and celebrated by most Nepalese, and certain animist practices of old indigenous religions survive.

Ethnic and regional equity
Nepali was the national language and Sanskrit became a required school subject. Children who spoke Nepali natively and who were exposed to Sanskrit had much better chances of passing the national examinations at the end of high school, which meant better employment prospects and the possibility of continuing into higher education. Children who natively spoke the local languages of the Madhesh and Hills, or the Tibetan dialects prevailing in the high mountains, were at a considerable disadvantage. This history of exclusion, coupled with poor prospects for improvement, created grievances that encouraged many in ethnic communities such as the Madhesi and Tharu in the Tharuhat and Madhesh and the Kham Magar in the mid-western hills to support the Unified Communist Party of Nepal (Maoist) and various other armed Maoist opposition groups such as the JTMM during and after the Nepalese Civil War. The negotiated end to this war forced King Gyanendra to abdicate in 2008. Issues of ethnic and regional equity have tended to dominate the agenda of the new republican government and continue to be divisive. Today, even after the end of the ten-year Maoist conflict, the upper castes dominate every field in Nepal. Although the Newars are few in number, their urban habitat gives them a competitive advantage. The Kayastha of Madhesh rank highest on the Human Development Index. From a gender perspective, Newari women are the most literate and lead in every sector; Brahmin and Chhetri women have experienced less social and economic mobility compared to Newari women. Specifically, Brahmin women experience less equality because their predominantly rural living conditions deprive them of access to certain educational and healthcare advantages.

Nepalese diaspora
Nepalese in the U.K.
In the 2001 census, approximately 6,000 Nepalese were living in the UK.
The Office for National Statistics estimates that 51,000 Nepal-born people are currently resident in the UK. There has been increasing interest among the Nepalese in the opportunities the UK offers, especially in education; between 2001 and 2006, there were 7,500 applications for student visas.

Nepalese in Hong Kong
The Nepali people residing in Hong Kong are primarily children of ex-Gurkhas, born in Hong Kong during their parents' service with the British Army's Brigade of Gurkhas, which was based in Hong Kong from the 1970s until the handover. Large groups of Nepali people can be found in Shek Kong and Yuen Long District, near the main bases of the British Army. Many ex-Gurkhas remained in Hong Kong after the end of their service under the sponsorship of their Hong Kong-born children, who held right of abode. Nepalese of middle age or older generations in Hong Kong work predominantly in security, while those of younger generations work predominantly in business. It is mostly people from Kirati ethnic groups, such as the Rai and Limbu, who are active and take part in elections.
Political conditions
2001: Royal massacre
The Royal Massacre (राजदरबार हत्याकाण्ड) happened on 1 June 2001, in which members of the royal family, including King Birendra, Queen Aishwarya, Crown Prince Dipendra, and Prince Nirajan, as well as many others, were killed. After the massacre, the Crown Prince survived for a short while in a coma; although he never regained consciousness before dying, Crown Prince Dipendra was the monarch under the law of Nepali royal succession. Two days after his death, the late King's surviving brother Gyanendra was proclaimed king.

2002–2007: Suspension of parliament and Loktantra Andolan
On 1 February 2005, King Gyanendra suspended the Parliament, appointed a government led by himself, and enforced martial law, arguing that civil politicians were unfit to handle the Maoist insurgency. Telephone lines were cut and several high-profile political leaders were detained; other opposition leaders fled to India and regrouped there. A broad coalition called the Seven Party Alliance (SPA) was formed in opposition to the royal takeover, encompassing the seven parliamentary parties that held about 90% of the seats in the old, dissolved parliament. The UN-OHCHR, in response to events in Nepal, set up a monitoring program in 2005 to assess and observe the human rights situation there. On 22 November 2005, the Seven Party Alliance (SPA) of parliamentary parties and the Communist Party of Nepal (Maoist) agreed on a historic and unprecedented 12-point memorandum of understanding (MOU) for peace and democracy. Nepali people from various walks of life and the international community regarded the MOU as an appropriate political response to the crisis developing in Nepal. Against the backdrop of the historical sufferings of the Nepali people and the enormous human cost of the last ten years of violent conflict, the MOU, which proposed a peaceful transition through an elected constituent assembly, created an acceptable formula for a united movement for democracy.
As per the 12-point MOU, the SPA called for a protest movement, and the Communist Party of Nepal (Maoist) supported it. This led to a countrywide uprising called the Loktantra Andolan, which started in April 2006. All political forces, including civil society and professional organisations, actively galvanised the people, resulting in massive and spontaneous demonstrations and rallies held across Nepal against King Gyanendra's autocratic rule. On 21 April 2006, King Gyanendra declared that "power would be returned to the people". This had little effect on the people, who continued to occupy the streets of Kathmandu and other towns, openly defying the daytime curfew. Finally, at midnight on 24 April 2006, King Gyanendra announced the reinstatement of the House of Representatives, thereby conceding one of the major demands of the SPA. Following this action, the coalition of political forces decided to call off the protests. At least 14 people died during the 19 days of protests. On 19 May 2006, the parliament assumed total legislative power and gave executive power to the Government of Nepal (previously known as His Majesty's Government). The names of many institutions (including the army) were stripped of the "royal" adjective, and the Raj Parishad (a council of the King's advisers) was abolished, with its duties assigned to the Parliament itself. The activities of the King became subject to parliamentary scrutiny, and the King's properties were subjected to taxation. Moreover, Nepal was declared a secular state, abrogating its previous status as a Hindu kingdom. However, most of these changes have, as yet, not been implemented. On 19 July 2006, the prime minister, G. P. Koirala, sent a letter to the United Nations announcing the intention of the Nepali government to hold elections to a constituent assembly by April 2007.
December 2007 to May 2008: Abolition of the monarchy
On 23 December 2007, an agreement was made for the monarchy to be abolished and the country to become a federal republic, with the Prime Minister becoming head of state. The Communist Party of Nepal (Maoist) became the largest party amidst a general atmosphere of fear and intimidation from all sides. A federal republic was established in May 2008, with only four members of the 601-seat Constituent Assembly voting against the change, which ended 240 years of royal rule in Nepal. The government announced a three-day public holiday (28–30 May) to celebrate the country becoming a federal republic.

Since 2008
Major parties such as the Unified Communist Party of Nepal (Maoist), the Communist Party of Nepal (Unified Marxist-Leninist) (CPN UML), and the Nepali Congress agreed to write a constitution to replace the interim constitution within two years. The Maoists, as the largest party in the country, took power right after the elections and named Pushpa Kamal Dahal (Prachanda) as Prime Minister of Nepal. The CPN UML also joined the government, while the Nepali Congress took the role of the main opposition party. Prachanda soon fell into a dispute with the then army chief, Rookmangud Katawal, and decided to sack him. But President Ram Baran Yadav, as the supreme head of military power in the country, revoked this decision and gave the army chief additional time in office. An angry Prachanda and his party quit the government, citing this as the main reason, and decided thereafter to operate as the main opposition to the government headed by the CPN UML and its partner the Nepali Congress. Madhav Kumar Nepal was named Prime Minister. The Maoists demanded civilian supremacy over the army, forced closures (commonly known as bandhs) in the country, and declared autonomous states for almost all the ethnic groups in Nepal.
In May 2012, the Constituent Assembly was dissolved, and an election to select new Constituent Assembly members was announced by Dr. Baburam Bhattarai.

Madhes Movement (2007–2016)
The Madhes Movement (Nepali: मधेस अान्दोलन) is a political movement launched by various political parties, especially those based in Madhes, for the equal rights, dignity, and identity of Madhesis, Tharus, Muslims, and Janjati groups in Nepal. In nearly a decade, Nepal witnessed three Madhes Movements: the first erupted in 2007, the second in 2008, and the third in 2015. On the origin of the first Madhes Movement, the journalist Amarendra Yadav writes in The Rising Nepal: "When the then seven-party alliance of the mainstream political parties and the CPN-Maoist jointly announced the Interim Constitution in 2007, it totally ignored the concept of federalism, the most desired political agenda of Madhesis and other marginalised communities. A day after the promulgation of the interim statute, a group of Madhesi activists under the Upendra Yadav-led Madhesi Janaadhikar Forum-Nepal (then a socio-intellectual NGO) burnt copies of the interim constitution at Maitighar Mandala, Kathmandu." This triggered the first Madhes Movement. The second Madhes Movement took place in 2008, jointly launched by the Madhesi Janaadhikar Forum-Nepal, the Terai Madhes Loktantrik Party, and the Sadbhawana Party led by Rajendra Mahato, with three key agendas: federalism, proportional representation, and population-based election constituencies, which were later ensured in the Interim Constitution of Nepal 2008. However, the Constitution of Nepal 2015 backtracked on those issues, which had already been ensured by the Interim Constitution of Nepal 2008. Supreme Court of Nepal advocate Dipendra Jha writes in The Kathmandu Post: "many other aspects of the new constitution are more regressive than the Interim Constitution of Nepal 2007.
Out of all its deficiencies, the most notable one concerns the issue of proportional representation, or inclusion in all organs of the state." This triggered the third Madhes Movement by Madhesis in Nepal. Although a first amendment to the constitution was made, resistance to the document by Madhesis and Tharus in Nepal still continues.

From 2017 to 2019
In June 2017, Nepali Congress leader Sher Bahadur Deuba was elected the 40th Prime Minister of Nepal, succeeding Prime Minister and Chairman of the CPN (Maoist Centre) Pushpa Kamal Dahal. Deuba had previously been Prime Minister from 1995 to 1997, from 2001 to 2002, and from 2004 to 2005. In November 2017, Nepal had its first general election since the civil war ended and the monarchy was abolished. The main alternatives were the centrist Nepali Congress and the alliance of the former Maoist rebels and the communist UML party. The communist alliance won the election, and UML leader Khadga Prasad Sharma Oli was sworn in as the new Prime Minister in February 2018. He had previously been Prime Minister from 2015 to 2016.

Political crisis 2020
Since the inception of the NCP, a struggle for power between its two leaders, Khadga Prasad Sharma Oli and Pushpa Kamal Dahal, had been under way. The internal crisis led Khadga Prasad Oli to dissolve the House of Representatives, the lower house of parliament, twice within six months. The dissolutions were approved by the President, but the Supreme Court denied their legality, and after its historic decision the parliament was reinstated both times. Oli then lost a vote of confidence in parliament. As per Article 72(6) of the Constitution of Nepal, the opposition was given the opportunity by President Bidya Devi Bhandari to form a new government, but it could not do so, as it lacked the support of one faction of the Janata Samajbadi Party.
As a result, the party divided, and the Loktantrik Samajwadi Party, Nepal was formed after nearly a month. Khadga Prasad Oli was then sworn in again as Prime Minister of Nepal. He dissolved the House of Representatives once more.
Together, they account for approximately 70% of the country's merchandise exports. The cost of living in Nepal is comparatively low, though not the lowest, and the quality of life has declined considerably in recent years. Nepal was ranked the 54th worst of 81 ranked countries (those with a GHI > 5.0) on the Global Hunger Index in 2011, between Cambodia and Togo. Nepal's current score of 19.5 is better than its 2010 score (20.0) and much improved from its 1990 score of 27.5.

Foreign investments and taxation
Large numbers of small foreign investments come to Nepal via non-resident Nepalis, who invest in many sectors. Nepal has huge potential for hydroelectricity, and a large number of foreign companies are accordingly willing to invest in Nepal, but political instability has stopped the process. Nepal has entered into agreements for the avoidance of double taxation (all using the credit method) with 10 countries since 2000, and has had investment protection agreements with 5 countries since 1983. In 2014, Nepal restricted foreign aid by setting a minimum limit for foreign grants and for soft and commercial loans from its development partners.

Imports and exports
Nepal's merchandise trade balance has improved somewhat since 2000 with the growth of the carpet and garment industries. In the fiscal year 2000–2001, exports posted a greater increase (14%) than imports (4.5%), helping bring the trade deficit down by 4% from the previous year, to $749 million. Recently, the European Union has become the largest buyer of ready-made garments and of fruits and vegetables (mostly apples, pears, tomatoes, various salads, peaches, nectarines, potatoes, and rice) from Nepal; exports to the EU accounted for 46.13 percent of the country's garment exports. The annual monsoon rain strongly influences economic growth. From 1996 to 1999, real GDP growth averaged less than 4%.
The growth rate recovered in 1999, rising to 6% before slipping slightly to 5.5% in 2001. Strong export performance, including earnings from tourism, and external aid have helped improve the overall balance of payments and increase international reserves. Nepal receives substantial amounts of external assistance from the United Kingdom, the United States, Japan, Germany, and the Scandinavian countries. Several multilateral organisations, such as the World Bank, the Asian Development Bank, and the UN Development Programme, also provide assistance. In June 1998, Nepal submitted its memorandum on a foreign trade regime to the World Trade Organization, and in May 2000 it began direct negotiations on its accession.

Resources
Progress has been made in exploiting Nepal's natural resources, tourism and | fuel, and fodder and contributing to erosion and flooding. Although steep mountain terrain makes exploitation difficult, mineral surveys have found small deposits of limestone, magnesite, zinc, copper, iron, mica, lead, and cobalt. The development of hydroelectric power projects also causes some tension with local indigenous groups, recently empowered by Nepal's ratification of ILO Convention 169.

Macro-economic trend
This is a chart of the trend of the gross domestic product of Nepal at market prices, estimated by the International Monetary Fund and EconStats, with figures in millions of Nepali rupees. The following table shows the main economic indicators in 1980–2018.

Statistics
GDP: purchasing power parity - $84.37 billion (2018 est.)
GDP - real growth rate: 21.77% (2017)
GDP - per capita: purchasing power parity (current international $) - $2,700 (2017 est.)
GDP - composition by sector: agriculture: 17%, industry: 13.5%, services: 60.5%, tourism: 9% (2017 est.)
Population below poverty line: 25.6% (2017/2018)
Household income or consumption by percentage share: lowest 10%: 3.2%; highest 10%: 29.8% (1995–96)
Inflation rate (consumer prices): 4.5% (2017)
Labour force: 4 million (2016 est.)
Labor force - by occupation: agriculture 19%, services 69%, industry 12% (2014 est.)
Unemployment rate: 1.47% (2017 est.)
Budget:
  revenues: $5.954 billion
  expenditures: $5.974 billion, including capital expenditures of $NA (2017 est.)
Industries: tourism, carpets, textiles; small rice, jute, sugar, and oilseed mills; cigarettes; cement and brick production
Industrial production growth rate: 10.9% (2017 est.)
Electricity - production: 41,083 GWh (2017)
Electricity - production by source:
  fossil fuel: 7.5%
  hydro: 91.5%
  nuclear: 0.3%
  other: 0.7% (2001)
Available energy: 6,257.73 GWh (2017)
  NEA hydro: 2,290.78 GWh (2014)
  NEA thermal: 9.56 GWh (2014)
  Purchase (total): 2,331.17 GWh (2014)
  India (purchase): 2,175.04 GWh (2017)
  Nepal (IPP): 1,258.94 GWh (2014)
Electricity - consumption: 4,776.53 GWh (2017)
Electricity - exports: 856 GWh (2001)
Electricity - imports: 12 GWh (2001)
Oil - production: (2001 est.)
Oil - consumption: (2001)
Agriculture - products: fruits and vegetables (mostly apples, pears, tomatoes, peaches, nectarines, potatoes), rice, maize, wheat, sugarcane, root crops, milk, and buffalo meat
Exports: $1.34 billion f.o.b. (2017 est.); does not include unrecorded border trade with India
Exports - commodities: carpets, clothing, leather goods, jute goods, grain
Exports - partners: India 56.6%, US 11.5%, Turkey 4% (2016 est.)
Imports: $1.03 billion f.o.b. (2017 est.)
Imports - commodities: gold, machinery and equipment, petroleum products, electrical goods, medicine
Imports - partners: India 70.1%, China 10.3%, UAE 2.6%, Singapore 2.1%, |
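The energy figures above mix reporting years, so they need not reconcile exactly; as a quick arithmetic sketch, the 2014 components quoted above can at least be summed on their own (the 2017 "available energy" total comes from a different year and is not expected to match):

```python
# Sum of the 2014 energy-balance components quoted above (GWh, as given).
# The "Available energy" total is a 2017 figure, so the 2014 components
# are not expected to add up to it exactly.
components_2014 = {
    "NEA hydro": 2290.78,
    "NEA thermal": 9.56,
    "Purchase (total)": 2331.17,
    "Nepal (IPP)": 1258.94,
}
total_2014 = sum(components_2014.values())
print(f"2014 components sum to {total_2014:,.2f} GWh")  # 5,890.45 GWh
```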
The first telephone exchange was established in Kathmandu in 1960. From 1960 to 2004, the state-owned Nepal Telecommunications Corporation (NTC), also known as Nepal Telecom, or Nepal Doorsanchar Company Limited (NDCL), was the monopoly telecom carrier. Now, other competing telecom service providers are United Telecom (UTL) and Ncell.

Telephones - PSTN: 644,347 (May 2013); CDMA telephones and cellular service available from Nepal Telecom, formerly known as Nepal Telecommunications Corporation (NTC), and Spice Nepal (Pvt.) Ltd.
Telephone system: good telephone and telegraph service; fair radiotelephone communication service and mobile cellular telephone network
Domestic: microwave and optical fiber
International: radiotelephone communications; microwave landline to India; satellite earth stations - 2 Intelsat (Indian Ocean)

Telecom operators

There are currently three major telecom operators in Nepal: Nepal Telecom, Ncell, and SmartCell. Besides these, the much-anticipated CG Telecom is set to launch its services this year.

Net neutrality

Development of broadcasting:
Mobile subscribers: 18,137,771 (May 2013)

Radio
Radio broadcast stations: AM 6, FM 20, shortwave 1 (January 2000)
Radios: 2,000,000 (2006)

Television broadcasting
Television broadcast stations: 19 (37 registered) (2012)
Televisions: 130,000 (1997)

Internet
Registered Internet Service Providers (ISPs): 127 (Jan 2020)
Internet users: 6,685,427 (May 2013)
Country code: 00977
Internet sites: some top ISPs in Nepal; List of internet service providers in Nepal; 4G connectivity in Nepal; 5G connectivity in Nepal

Nepal Telecom has announced plans to launch 5G in the next few years, after the nationwide deployment of 4G.

See also
Nepal Telecommunications Authority
List of Telecom Companies in Nepal
List of countries by | with great pride and a sense of accomplishment, Nepal Telecommunication Corporation was transformed into Nepal Doorsanchar Company Limited (NDCL) from Baisakh 1, 2061. NDCL is a company registered under the Companies Act 2053 with an 85% government share. However, the company is known to the general public as Nepal Telecom (NT), a registered trademark.
Further developments and milestones

Some milestones:
2017 Cellular 4G LTE starts in Nepal
2011 Launching of GSM 3G Data Only Service
2011 Launching of EasyPhone SIP PPP Service
2010 Launching of EasyPhone SIP EasyCall Service
2010 Soft launch of EasyPhone IP Call Service
2010 EVDO Service started
2009 Postpaid CDMA Mobile Service started
2009 SMS Service from GSM to CDMA mobile started
2009 IVR 1606 Service extended outside Kathmandu Valley
2009 IVR 198 Service extended outside Kathmandu Valley
2008 PSTN VMS - Notice Board Service launched
2008 IVR 198 service extended for ADSL Fault Complaint Registration
2008 IVR Service 1607 started for GSM and CDMA PUK Enquiry
2008 Broadband ADSL Service launched
2007 GPRS, 3G and CRBT Services introduced in GSM Mobile
2007 VOIP Call Complaint Registration started via 188 IVR Service
2007 PSTN Bill Enquiry Service started via 1606 IVR Service
2007 Expansion of Internet bandwidth via optical link between Nepal and India
2007 National Roaming for CDMA Mobile (Sky Phone) started
2006 CDMA limited services in Kathmandu Valley
2006 MCC (198) Complaint Registration via IVR in Kathmandu Valley
2006 Home Country Direct Service - NepalDirect (IN)
2006 PSTN Credit Limit Service - PCL (IN)
2005 Outsourcing of Enquiry Service (197)
2005 Access Network Services
2005 Soft launch of CDMA
2004 Pre-paid Calling Card Service (IN Services)
2004 NEPAL TELECOM (transformation from Corporation to Nepal Doorsanchar Company Limited)
2003 GSM Prepaid Service
2002 East-West Highway Optical Fiber Project
2001 Launching of Payphone Service
2000 Launching of Internet Service
2000 Implementation of SDH Microwave Radio
1999 Launching of GSM Mobile Service
1998 Direct link with Bangladesh
1997 Digital link with D.O.T. India through optical fiber in Birgunj - Raxaul
1996 Introduction of VSAT services
1996 Independent international gateway exchange established
1996 Automation of the entire telephone network
1996 Conversion of all transmission links to digital transmission links
1995 Installation of Optical Fiber Network
1987 Commencement of STD service
1984 Reliable Rural Telecom Service (JICA)
1984 Commencement of STD service
1983 Establishment of digital telephone exchange
1982 Establishment of SPC telex exchange
1982 Establishment of Standard "B" Type Earth Station for international circuits
1974 Microwave transmission links established for internal trunks
1971 Introduction of Telex Services
1965 First automatic exchange in Nepal (1,000 lines in Kathmandu)
1964 Beginning of international telecommunications service using HF radio to India and Pakistan
1962 First public telephone exchange in Kathmandu (300 lines CB)
1955 Distribution of telephone lines to the general public
1951 Installation of open wire trunk line from Kathmandu to Palpa
1950 Establishment of CB telephone exchange (100 lines) in Kathmandu
1950 Introduction of high-frequency radio system (AM)
1950 Establishment of Telegram Service
1936 Installation of open wire trunk line from Kathmandu to Dhankuta
1935 Installation of 25-line automatic exchange in Royal Palace
1914 Establishment of open wire trunk link from Kathmandu to Raxaul (India)
1913 Establishment of first telephone lines in Kathmandu
Road

Road is the country's primary transport mode. The Economic Survey 2014-15, released by the Ministry of Finance (Nepal), shows that the country has a total road network of , which includes of roads constructed and maintained by the Department of Roads (DoR) and of roads constructed by local government bodies.

Highways
Total: 31,393 km
Paved: 14,102 km
Gravel: 7,881 km
Unpaved: 9,410 km (2018 est.)

Rail

There are two railway lines in the country: the Raxaul–Sirsiya and the Jainagar–Janakpur. The former is a line from Raxaul, India to the Sirsiya Inland Container Depot (or dry port) near Birganj, Nepal, and is primarily used for freight transport. It allows container traffic to be imported to Nepal through the Sirsiya dry port container depot. The latter is a line from Jaynagar, India to Janakpur, Nepal, and is used primarily for passenger transport. Nepal and India agreed to construct a railway line linking Raxaul with Kathmandu during Prime Minister KP Oli's visit to India. A team of technical officers visited Kathmandu to study the proposed railway from Raxaul to Kathmandu, and they have stated that a feasibility study of the project would begin. They have already identified Chobhar as the terminus of the 113 km-long line. A line through Kathmandu, linking India with Lhasa in Tibet, has been |
A treaty was signed on 2 June 1789 in Kerung; it is called the 'Treaty of Kerung' by historians. Rasuwa Gadhi and Timure were the firm bases in the first Nepal-Tibet war, and Syabru Besi and Rasuwa Gadhi were strategic points in it. Likewise, the villages of Listi and Duguna were the main bases for offensive operations against Tibet; they were the forwardmost dumping places of the Royal Nepalese Army. Although the Rasuwa Gadhi and Duguna Gadhi fortresses had not been constructed at the time, the places themselves were important because of their military significance.

Nepal-Tibet/China War

Nepalese-Tibetan War

Foreign involvements

Royal Nepal Army in the Indian Sepoy Mutiny
Royal Nepal Army in the First World War, 1914–1918
Royal Nepal Army in the Waziristan War
Royal Nepal Army in the Afghan War, 1919
Royal Nepal Army in the Second World War
Royal Nepal Army in the Hyderabad Action, 1948

Domestic operations

Disarmament of the Khampas - 1974

In 1974, the Royal Nepalese Army (RNA) was mobilized to disarm the Tibetan Khampas, who had been using Nepalese soil to wage guerrilla war against Chinese forces. The Khampas had secretly created their base in Mustang (north-west Nepal) and were operating from there against China. The RNA, under immense diplomatic pressure from China and the international community, moved nine infantry units towards the Khampa post in Mustang and gave them an ultimatum to either disarm and surrender, or face the consequences. The terms of their surrender were that they would be given Nepalese citizenship, land, and some money. The Khampa commander Wang Di agreed to surrender but eventually fled the camp. He was later killed by RNA forces in Doti, far-western Nepal, while trying to loot a Nepal Police post. This was the first time that the RNA had been mobilized domestically in such large numbers.
Nepalese Civil War

International operations

The Nepal Army's long association with UN peace support operations began with the deployment of five military observers to Lebanon (UNOGIL, the United Nations Observation Group in Lebanon) in 1958, and the first Nepalese contingent, the Purano Gorakh battalion, was deployed in Egypt in 1974. Nepal's participation in UN peacekeeping operations spans a period of 50 years and 42 UN missions, the latest being UNSMIL in Libya; over 60,652 Nepalese soldiers have served in support of UN peacekeeping endeavors. The Nepal Army has contributed force commanders, military contingents, military observers, and staff officers. Nepalese troops have taken part in some of the most difficult operations and have suffered casualties in the service of the UN: to date, 54 have been lost on duty with the UN, while 57 have been seriously wounded. Its most significant contribution has been to peace and stability in Africa, where it has demonstrated its capacity to sustain large troop commitments over prolonged periods. Presently, Nepal is ranked as the second largest troop contributing country (TCC) to the UN. Missions include the United Nations Interim Force in Lebanon (UNIFIL), the UN Protection Force (UNPROFOR), UN Operation in Somalia II (UNOSOM II), and the United Nations Mission in Haiti (UNMIH).
UNAMSIL - Currently, Nepal is sending an 800-man battalion to serve in the peacekeeping mission in Sierra Leone (UNAMSIL).
UNMIS - The Nepalese Army has sent a protection company of 200 personnel to the United Nations Mission in Sudan.
RCHQ - The RCHQ, Kassala, is also manned by Nepalese staff.
UNMISET - the UN mission in Timor-Leste (East Timor)
MINUSTAH - the UN mission in Haiti
UNDOF

U.S./Nepal military relations

The U.S.-Nepali military relationship focuses on support for democratic institutions, civilian control of the military, and a professional military ethic that includes respect for human rights. Both countries have had extensive contact over the years. Nepali Army units and Nepalese Army Air Service units have served with distinction alongside American forces in places such as Haiti, Iraq, and Somalia. U.S.-Nepali military engagement continues today through IMET, Enhanced International Peacekeeping Capabilities (EIPC), and various conferences and seminars. The U.S. military sends many Nepalese Army officers to America to attend military schools such as the Command and General Staff College and the U.S. Army War College. The IMET budget for FY2001 was $220,000. The EIPC program is an interagency program between the Department of Defense and the Department of State to increase the pool of international peacekeepers and to promote interoperability. Nepal received about $1.9 million in EIPC funding. Commander, United States Pacific Command (CDRUSPACOM) coordinates military engagement with Nepal through the Office of Defense Cooperation (ODC). ODC Nepal is located in the American Embassy, Kathmandu.

PRC and India military relations with Nepal

India has agreed to resume military aid to Nepal. The aid was in the pipeline before India imposed an embargo in February 2005, following the seizure of power by the then King Gyanendra. In 2009, the People's Republic of China pledged military aid worth Rs 100 million to Nepal.

Divisions

The command of the Nepalese army is divided into 8 divisions, namely:
Far Western Division
North Western Division
Mid Western Division
Western Division
Mid Division
Valley Division
Mid Eastern Division
Eastern Division

Statistics

Military branches: Nepalese Army (includes Nepalese Army Air Service), Armed Police Force Nepal, Nepalese Police Force
Military manpower - military age: 17 years of age
Military manpower - availability: males age 15–49: 6,674,014 (2003 est.)
Military manpower - fit for military service: males age 15–49: 3,467,511 (2003 est.)
Military manpower - reaching military age annually: males: 303,222 (2003 est.)
Military expenditures - dollar figure: $57.22 million (FY02)
Military expenditures - percent of GDP: 1.1% (FY02)

Gorkhas

Nepal is also notable for the Gorkhas. Significant sections of the British Army and Indian Army are recruited from Nepal. This arrangement dates from the days of the British East India Company's rule of India, when Company troops tried to invade Nepal and were beaten back. Both sides were impressed with the other, and Gurkhas were recruited into the Company's forces. The Gurkhas remained loyal during the Indian Mutiny of 1857 and were kept on in the Indian Army thereafter. Upon Indian independence in 1947, some units went to British service and some to Indian service, under a Britain-India-Nepal Tripartite Agreement signed between the three nations. The Gorkhas are feared troops, and their signature weapon is the | used to have three members: the Prime Minister, the Defence Minister and the Chief of the Army Staff. In accordance with the Constitution, the King (as Supreme Commander) used to "operate and use" the "Royal Nepal Army on the recommendation" of this council.

Battles of unification campaigns

The Nepalese army fought various battles in the national unification campaigns of the 18th century. These battles of Nepal's unification helped the Royal Nepalese Army gain experience while helping to unify Nepal.
Battles of the Nepal unification campaign

Engagements

Battle against Mir Qasim

The fortress of Makawanpur has historical and military significance for the Nepalese. It was here that the Nepalese defeated the superior forces of Mir Qasim in 1763 and seized 500 guns and two cannons. Later on, these weapons were used by Nepalese troops, and four regular companies were established, namely Srinath, Kalibox, Barda Bahadur (Bardabahini) and Sabuj; (Purano) Gorakh Company was established a few months later. This was the first rank-and-file system, beginning a proper organizational history for the Royal Nepalese Army. The battle against Mir Qasim's troops was the first battle of the Royal Nepalese Army against a foreign power. Sardar Nandu Shah was the fortress commander of Makawanpur, with 400 troops, some guns and home-made traditional weapons such as the Dhanu, Khukuri, Talwar and Ghuyatro. They devised various hit-and-run strategies to surprise the enemy, and a spoiling-attack base was set up on the Taplakhar mountain ridge for night operations. Mir Qasim's renowned warrior Gurgin Khan was the commander on the other side, with approximately 2,500 troops equipped with cannons, guns and ammunition, and very good logistics backup. Their attack base was at the bottom of the Makawanpur Gadhi hill, and they had planned a night attack. When the enemy's heavy forces marched in December 1762 and arrived at Harnamadi in January 1763, they found all the local houses already evacuated and the area short of food provisions. Makawanpur Gadhi was on top of a mountain, about nine kilometers uphill from the Harnamadi area. Although the Nepalese had physically occupied all the fortresses en route, the enemy was initially able to push them back to the Makawanpur Gadhi area. About 300 enemy troops launched a strong attack on 20 January 1763, putting the Nepalese still more on the defensive.
But they were totally surprised while resting at Taplakhar, as Kaji Vamsharaj Pande led a downhill attack on them, Kaji Naharsingh Basnyat led an uphill attack from below, and Nandu Shah led a frontal attack. The smooth coordination among the three, leading their by now battle-hardened troops in the dark of night, led the bewildered enemy to scatter. About 1,700 of them died, while 30 Nepalese soldiers were lost in the battle. The Nepalese captured 500 rifles and two cannons along with other military equipment. More importantly, the battle led to the beginning of a proper organization of the Royal Nepalese Army.

Other major engagements

Battle of Pauwa Gadhi against Captain Kinloch - 1767 AD
Anglo-Nepal War - 1814 AD

First Nepal-Tibet War

Relations began to sour after the Malla rulers started to mint impure silver coins just before their downfall. The Tibetans demanded that the coins be replaced by pure silver ones. When Prithvi Narayan Shah took over, he found that it would be a great loss to him if he conceded to the Tibetan demands, and the issue remained unresolved at his untimely demise. Queen Mother Rajendra Laxmi, the regent of the minor King Rana Bahadur Shah, inherited the coinage problem, which reached its culminating point in 1788 AD. Another sore point in Nepal-Tibet relations was Nepal's decision to provide refuge to Syamarpa Lama and his 14 Tibetan followers, who had fled from Tibet to Nepal on religious and political grounds. Yet another cause for conflict was the low-quality salt being supplied by Tibetans to Nepal; all salt came from Tibet in those days. Tibet ignored the Nepalese ultimatums, and that prompted preparations for war. Nepal was soon preparing to launch multi-directional attacks.

Kerung axis: Kaji Balbhadra Shah was the main commander of the offensive from the Kerung axis. Kaji Kirtiman Singh Basnyat, Sardar Amar Singh Thapa and Kapardar Bhotu Pande were the subordinate commanders under him.
Approximately 6,000 troops and 3,200 porters were despatched for this operation. Their main objective was to capture Dirgacha through Kerung. The march of the troops was delayed because Balbhadra Shah became seriously ill. They crossed Kerung on 20 July 1788 and captured Jhunga on 3 August 1788. Kapardar Bhotu Pande was captured by the Tibetans. The Nepalese troops were reinforced with 2,000 more troops, and Kapardar Bhotu Pande was freed from the Tibetans on 14 October 1788.

Kuti axis (I): Shree Krishna Shah was the commander, and Kaji Ranajit Pande, Sardar Parath Bhandari, Captain Harsa Panta, Captain Naharsingh Basnyat and Captain Shiva Narayan Khatri were the subordinate commanders under him. About 5,800 soldiers and 3,000 porters were allotted for the offensive operation. Later on, Kaji Abhimansingh Basnyat and Ranajit Kunwar also joined this offensive. The Dalai Lama was taken by surprise, and to protect his sovereignty he initiated a parallel approach: he asked for military help from Sovan Shahi, the King of Jumla in west Nepal, requesting him to launch guerrilla activities and revolt against the Nepalese Army in and around Jumla. Sovan Shahi did revolt at Humla and captured some fortresses. The Dalai Lama also asked for military help from the Chinese Emperor. Additionally, he and the Panchen Lama of Dirgacha wrote a secret letter to the East India Company seeking military assistance. The Tibetans also spread propaganda about having constructed a new road through the Tigri valley and establishing a post at the front, and rumoured that they had assembled an army of 125,000 men. But the Tibetans could get nothing from Jumla, China or the East India Company.

Kuti axis (II): Kaji Damodar Pande led his troops with subordinate commanders Bom Shah, Dev Dutta Thapa and others. He was given about 4,000 troops, and his objective was to capture Dirgacha via the Kuti axis.
The battles

Having crossed the Himalayas, Nepalese troops captured Chhochyang and Kuti in June 1788 and Sikarjong on 3 August 1788, in spite of severe logistical limitations. Later, Bahadur Shah was able to provide some reinforcements and improve the logistics arrangements, but that was still not enough and progress was slow. When the Nepalese were about to capture Dirgacha via both Kuti and Kerung, the Tibetans started to make compromises with the Nepalese commanders. Bahadur Shah started negotiations, ultimately arriving at a solution: prisoners were handed back to the Tibetans, Tibet agreed to pay tribute of Rs. 50,000 in silver coins per annum to Nepal, and a treaty was signed on 2 June 1789 in Kerung.
kilometers border in the Himalayan range on the northern side of Nepal. Nepal has established its embassy in Beijing, opened consulates general in Lhasa, Hong Kong and Guangzhou, and appointed an honorary consul in Shanghai.

Economic cooperation

Nepal-China economic cooperation dates back to the formalization of bilateral relations in the 1950s. The first agreement between China and Nepal on economic aid was signed in October 1956. Since the mid-1980s, the Chinese government has pledged grant assistance to the government of Nepal under the Economic and Technical Cooperation Program in order to implement mutually acceptable development projects. Chinese assistance to Nepal falls into three categories: grants (aid gratis), interest-free loans and concessional loans, provided through different sources. Chinese financial and technical assistance has contributed greatly to Nepal's development efforts in infrastructure building, industrialization, human resource development, health, education, water resources, sports and the like. Some of the major ongoing projects under Chinese assistance include:
1. Upper Trishuli Hydropower Project - power station and transmission line projects (concessional loan)
2. Food/material assistance (grant) in 15 bordering districts of northern Nepal
3. Kathmandu Ring Road Improvement Project with flyover bridges (grant)
4. Tatopani Frontier Inspection Station Project (construction of ICDs at Zhangmu-Kodari) (grant)
5. Pokhara International Regional Airport (loan)
With the signing of the Memorandum of Understanding on Cooperation under the Belt and Road Initiative on 12 May 2017 in Kathmandu between Nepal and China, new avenues for bilateral cooperation in the mutually agreed areas are expected to open.
Nepal expects to upgrade its vital infrastructure, enhance cross-border connectivity with China and strengthen people-to-people relations under this initiative. The major thrust of the MoU is to promote mutually beneficial cooperation between Nepal and China in various fields such as the economy, environment, technology and culture. The MoU aims at promoting cooperation on policy exchanges, trade connectivity, financial integration and connectivity of people. The government of the People's Republic of China provided substantial and spontaneous support in Nepal's search, relief and rescue efforts following the devastating earthquakes of 2015, and has provided 3 billion yuan for Nepal's reconstruction, to be used in 25 jointly selected major projects for the 2016–2018 period. On 23 December 2016, Nepal and the People's Republic of China signed an Agreement on Economic and Technical Cooperation in Beijing, providing grant assistance of RMB 1 billion to the Government of Nepal for implementing the Syaphrubesi-Rasuwagadhi Highway Repair and Improvement Project, the Upgrading and Renovation Project of the Civil Service Hospital, and mutually agreed post-disaster reconstruction projects. The letters of exchange to initiate the Syaphrubesi-Rasuwagadhi Highway Repair and Improvement Project were signed on 9 May 2017.

Trade and investment

China is the second largest trading partner of Nepal. In 2015/16, total exports to China stood at US$181 million, a marginal increase from US$179 million in the previous fiscal year. In contrast, imports from China have been growing at a rate of 39 per cent per year, rising from US$421 million in fiscal year 2009/10 to US$1,247 million in fiscal year 2015/16. As a result, the trade deficit with China rose from US$401 million in 2009/10 to US$1,228 million in 2015/16. Although China has given zero-tariff entry to over 8,000 Nepali products since 2009, Nepal has not been able to bring the trade deficit down.
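As a rough sketch (not part of the source), the average annual growth implied by the two endpoint figures quoted above can be computed; this is a simple compound-growth calculation between the two fiscal years, so it will not necessarily match the year-on-year rates behind the quoted 39 per cent figure:

```python
# Compound-growth check on the quoted Nepal-China trade figures
# (US$ millions; FY 2009/10 to FY 2015/16, i.e. 6 fiscal years apart).
imports_start, imports_end = 421.0, 1247.0
deficit_start, deficit_end = 401.0, 1228.0
years = 6

# CAGR implied by the two endpoints alone
implied_import_cagr = (imports_end / imports_start) ** (1 / years) - 1
deficit_multiple = deficit_end / deficit_start

print(f"Implied average annual import growth: {implied_import_cagr:.1%}")
print(f"Trade deficit grew {deficit_multiple:.2f}x over the period")
```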
Nepal exports 370 products, including noodles and agro products, to China, and regularly participates in various trade fairs and exhibitions organized in China. The Nepal-China's Tibet Economic and Trade Fair is a regular biannual event hosted alternately by either side to enhance business interaction and promote economic cooperation between Nepal and the TAR. The 15th Nepal-China's Tibet Economic and Trade Fair was held on 17–22 November 2015 in Bhrikutimandap, Kathmandu, Nepal. The Nepal-China Non-Governmental Cooperation Forum, established in 1996, is led by the President of the Federation of the Nepali Chambers of Commerce and Industry (FNCCI) on the Nepali side and the Vice Head of the All-China Federation of Industry and Commerce (ACFIC) on the Chinese side. It is an initiative to mobilize the apex business organizations of both sides to enhance cooperation between the two private sectors. The 14th meeting of the Forum concluded in Kathmandu on 25–26 May 2017. China is the largest source of foreign direct investment in Nepal; Chinese investors showed intent to invest over $8.3 billion in Nepal during the Nepal Investment Summit concluded in Kathmandu in March 2017.

Tourism

China is the second largest source of foreign tourists to Nepal, with over 100,000 Chinese tourists visiting annually. China has designated Nepal as the first tourist destination in South Asia for its people. The Government of Nepal has waived visa fees for Chinese tourists effective from 1 January 2016, and the Chinese government announced the year 2017 as Nepal Tourism Promotion Year in China. Both sides have been carrying out joint efforts to promote Nepal in China and to encourage Chinese enterprises to invest in Nepal's tourism sectors. Nepal has road connectivity via Rasuwagadhi and Zhangmu for trade and international travelers, and 4 other border points are designated for bilateral trade.
Nepal has direct air links with Lhasa, Chengdu, Kunming, Guangzhou and the Hong Kong SAR of China.

Education and cultural cooperation

China provides scholarships every year to a total of up to 100 Nepali students studying in China. The Chinese side has been providing Chinese language training for 200 tourism entrepreneurs of Nepal over the next five years, as per the understanding reached between the two sides in March 2016. Both sides carry out activities in the culture and youth sectors as per the provisions of the MoU on Cultural Cooperation (1999) and the MoU on Youth Exchange (2009), and promote people-to-people relations through the regular hosting of cultural festivals, friendly visits by people from different walks of public life, exhibitions, cultural and film shows, food festivals, etc. Sister-city relations between the cities of the two countries are growing, and both sides have agreed to push cooperation through such relations. These relations are basically meant for carrying out exchanges and cooperation in the fields of economy, trade, transportation, science and technology, culture, tourism, education, sports and health, personnel, etc.

Regional and international affairs

Nepal is a founding member of the AIIB and holds observer status in the Shanghai Cooperation Organization. Both countries are members of the Asia Cooperation Dialogue, and China is an observer of SAARC. The two countries cooperate in various regional and UN forums on matters of common concern. Though Nepal initially let Tibetan Khampa rebels make use of Nepalese territory in the early 1960s, bilateral relations have generally been very good from 1975 onwards, after the annexation of the Kingdom of Sikkim by India in 1975. As many as twenty thousand Tibetan refugees live in Nepal, and this has been a major issue of concern between China and Nepal.
Kathmandu has in several instances been cracking down on the activities of the Tibetans receiving international condemnation. In 2005, Nepalese Foreign Minister Ramesh Nath Pandey called China "an all weather friend" and King Gyanendra's regime was also instrumental in inducting China into the SAARC. Nepalese in general, hold a positive view about the influence of China. In recent years, China has been one of the largest aid donors to Nepal just behind the UK. China is also Nepal's largest source of FDI. Denmark See Denmark–Nepal relations. European Union Nepal formally tied diplomatic relations with the EU in 1975. EU established its Technical Office at Kathmandu in 1992. Nepal established residential embassy in Brussels in 1992. EU Delegation office in Kathmandu has been upgraded to the Ambassadorial level since 2009 December. Development Cooperation: EU is the largest development partner and the second largest trade partner (if taken as a single trade bloc) of Nepal. Until 2013, EU assistance to Nepal was provided in two main ways: on a bilateral basis through the formulation of successive Country Strategy Papers (CSPs) in close partnership with the Government of Nepal and on a multilateral basis including all actions outside the CSP mainly funded through thematic budget lines. Looking at the history of CSP's for Nepal, the first CSP 2001–2006 allocated €70 million, and, the second CSP 2007–2013 allocated €114 million for Nepal. Cumulative contribution from EU to Nepal's development has reached 360 million Euro spread over more than 70 projects till 2013. Starting from 2014, the EU has begun channeling its development cooperation under its Multi-Annual Indicative Program (MIP). The EU has increased its development cooperation to Nepal by threefold for the current period of 2014–2020 compared to the proceeding period of the same duration. 
The MIP had identified three focal sectors for Nepal: €146 million for sustainable rural development, with focus on agricultural productivity and value addition, job creation, market access infrastructure, and nutrition (40.5%); €136.4 million for education, with the aim to improving basic education, quality, livelihood skills and equity for vulnerable and disadvantaged (38%); €74 million for strengthening democracy and decentralization, including its engagement in the area of public finance management reform efforts of the government at local and national level (20.5%); and remaining €3.6 million for other support measures (1%). The EU is also a major donor partner of the Nepal Peace Trust Fund. Cooperation with European Investment Bank (EIB): Nepal and EIB signed an umbrella agreement for financial cooperation on 7 May 2012 paving the way for major investments from EIB in Nepal's infrastructure and energy sectors. Following the agreement EIB has already committed a loan assistance of Euro 55 million for Tanahu Hydropower Project (140 MW) which has a total cost of US$500 million. EIB has also expressed its commitments on immediate additional concessional loan assistance of Rs. 1.5 billion for the same project. Talks are underway for EIB investment of $120 million for Kaligandaki-Marsyangdi Corridor Transmission Project, and $30 million for upgrading the Trishuli Corridor Transmission Line. Ms. Magdalena Alvarez Arza, Vice President of the European Investment Bank visited Nepal in June 2014 to work out on those commitments. Trade: The EU is one of the principal trading partners of Nepal, second largest export market with 13% share. The EU imports mainly handmade carpets, textile, gems and jewellery, wood and paper products, leather products, etc. from Nepal. Nepal imports engineering goods, telecommunication equipment, chemical and minerals, metal and steel, agricultural products, etc. from the EU countries. 
The EU started providing duty-free and quota-free facilities to the Nepalese exports under its Everything But Arms (EBA) policy for the LDCs from 2001. EU introduced the new Generalised System of Preferences (GSP) in 2006 which will remain valid till 2015. Under this scheme, for nearly 2,100 products out of 11,000, except arms and ammunitions, the EU duty rate will be zero. Humanitarian Aid: The European Commission is one of the biggest sources of humanitarian aid to Nepal. It has long been associated with the efforts of disaster management and mitigation projects in Nepal. EU Humanitarian Aid and Civil Protection Department (ECHO) has provided over Euro 74 million worth of humanitarian aid to Nepal since 2001 A.D. As a gesture of European solidarity to help those who are worst affected by the recent monsoon floods, the EU Delegation office in Kathmandu has provided Euro 250,000 assistance to flood affected people of mid-western region of Nepal in September 2014. Finland France Nepal and the French Republic entered into diplomatic relations on 20 April 1949. Bilateral economic cooperation programme commenced in February 1981 when the two countries signed the First Protocol amounting to French Franc 50 million loan which was converted into debt in 1989. Food aid and the counterpart funds that it generated have been the main form of aid since 1991. Main areas of cooperation are national seismologic network, petroleum exploration, restructuring of Water Supply Corporation, the Kavre Integrated Project and Gulmi and Arghakhanchi Rural Development Project, rehabilitation of airports, ‘food for work’, and others. Nepal and France have signed an agreement concerning Reciprocal Promotion and Protection of Investment in 1983. The major areas of French investment are hotels, restaurants, medicine, aluminium windows and doors, vehicle body building sectors. 
India As close neighbours, India and Nepal share a unique relationship of friendship and cooperation characterized by open borders and deep-rooted people-to-people contacts of kinship and culture. There has been a long tradition of free movement of people across the borders. The India-Nepal Treaty of Peace and Friendship of 1950 forms the bedrock of the special relations that exist between India and Nepal. Political: Beginning with the 12-Point understanding reached between the Seven Party Alliance and the Maoists at Delhi in November 2005, the Government of India has welcomed the road-map laid down by the historic Comprehensive Peace Agreement of November 2006 towards political stabilization in Nepal through peaceful reconciliation and inclusive democratic processes. India has consistently responded with a sense of urgency to the needs of the people and Government of Nepal in ensuring the success of the peace process and the institutionalization of multi-party democracy through the framing of a new Constitution by a duly elected Constituent Assembly. India has always believed that only an inclusive Constitution with the widest possible consensus, taking on board all stakeholders, would … political, economic, security and all possible assistance from Bangladesh while dealing with Nepal's hegemonic neighbour India to address Nepal's interests, as Nepal on its own lacks the economic and diplomatic weight to deal with India.
However, people familiar with Nepal's political culture remain highly skeptical of such a possibility and instead point to the fact that Nepal is on the verge of losing even more of its strategic autonomy because of the insertion of an Indian fifth column – the Madheshis – into Nepal's power structure. Bhutan Relations with Bhutan have been strained since 1992 over the nationality and possible repatriation of refugees from Bhutan. Canada Many Nepalese politicians and government officials criticized Canadian diplomats in the aftermath of the Kabul attack on Canadian Embassy guards, in which the majority of victims were Nepalese citizens. Members of Parliament were among those critical of the way Canada treated its security contractors at the embassy, leading to meetings in Ottawa between Nepalese and Canadian diplomats, including ambassador Nadir Patel. China Nepal formally established relations with the People's Republic of China on August 1, 1955. The two countries share a 1,414-kilometre border along the Himalayan range on Nepal's northern side. Nepal has established its embassy in Beijing, opened consulates general in Lhasa, Hong Kong and Guangzhou, and appointed an honorary consul in Shanghai. Economic Cooperation Nepal-China economic cooperation dates back to the formalization of bilateral relations in the 1950s. The first Agreement between China and Nepal on Economic Aid was signed in October 1956. Since the mid-1980s the Chinese Government has been pledging grant assistance to the Government of Nepal under the Economic and Technical Cooperation Program in order to implement mutually acceptable development projects. Chinese assistance to Nepal falls into three categories: grants (aid gratis), interest-free loans and concessional loans. This assistance of various kinds is provided to Nepal through different sources.
Chinese financial and technical assistance has contributed greatly to Nepal's development efforts in the areas of infrastructure building, industrialization, human resource development, health, education, water resources, sports and the like. Some of the major ongoing projects under Chinese assistance include:
1. Upper Trishuli Hydropower Project – power station and transmission line projects (concessional loan)
2. Food/material assistance (grant) in 15 bordering districts of northern Nepal
3. Kathmandu Ring Road Improvement Project with flyover bridges (grant)
4. Tatopani Frontier Inspection Station Project (construction of ICDs at Zhangmu-Kodari) (grant)
5. Pokhara International Regional Airport (loan)
With the signing of the Memorandum of Understanding on Cooperation under the Belt and Road Initiative on 12 May 2017 in Kathmandu between Nepal and China, new avenues for bilateral cooperation in mutually agreed areas are expected to open. Nepal expects to upgrade its vital infrastructure, enhance cross-border connectivity with China and strengthen people-to-people relations under this initiative. The major thrust of the MoU is to promote mutually beneficial cooperation between Nepal and China in fields such as economy, environment, technology and culture. The MoU aims at promoting cooperation on policy exchanges, trade connectivity, financial integration and connectivity of people. The Government of the People's Republic of China provided substantial and spontaneous support in the search, relief and rescue efforts in Nepal following the devastating earthquakes of 2015. China has provided 3 billion yuan for Nepal's reconstruction, to be used in 25 jointly selected major projects for the 2016–2018 period.
On 23 December 2016, Nepal and the People's Republic of China signed an Agreement on Economic and Technical Cooperation in Beijing providing grant assistance of RMB 1 billion to the Government of Nepal for implementing the Syaphrubesi-Rasuwagadhi Highway Repair and Improvement Project, the Upgrading and Renovation Project of the Civil Service Hospital, and mutually agreed post-disaster reconstruction projects. The Letters of Exchange to initiate the Syaphrubesi-Rasuwagadhi Highway Repair and Improvement Project were signed on May 9, 2017. Trade and Investment: China is the second largest trading partner of Nepal. In 2015/16, total exports to China stood at US$181 million, a marginal increase from US$179 million in the previous fiscal year. In contrast, imports from China have been growing at the rate of 39 per cent per year, rising from US$421 million in fiscal year 2009/10 to US$1,247 million in fiscal year 2015/16. As a result, the trade deficit with China has risen from US$401 million in 2009/10 to US$1,228 million in 2015/16. Although China has granted zero-tariff entry to over 8,000 Nepali products since 2009, Nepal has not been able to bring the trade deficit down. Nepal exports 370 products, including noodles and agro products, to China. Nepal regularly participates in various trade fairs and exhibitions organized in China. The Nepal-China's Tibet Economic and Trade Fair is a regular biannual event hosted alternately by each side to enhance business interaction and promote economic cooperation between Nepal and the TAR. The 15th Nepal China's Tibet Economic and Trade Fair was held on 17–22 November 2015 at Bhrikutimandap, Kathmandu, Nepal. The Nepal-China Non-Governmental Cooperation Forum, established in 1996, is led by the President of the Federation of Nepali Chambers of Commerce and Industry (FNCCI) on the Nepali side and the Vice Head of the All-China Federation of Industry and Commerce (ACFIC) on the Chinese side.
It is an initiative to mobilize the apex business organizations of both countries to enhance cooperation between their private sectors. The 14th meeting of the Forum concluded in Kathmandu on 25–26 May 2017. China is the largest source of Foreign Direct Investment in Nepal. Chinese investors showed intent to invest over $8.3 billion in Nepal during the Nepal Investment Summit concluded in Kathmandu in March 2017. Tourism: China is the second largest source of foreign tourists to Nepal. Over 100,000 Chinese tourists visit Nepal annually. China has designated Nepal as the first tourist destination in South Asia for its people. The Government of Nepal has waived visa fees for Chinese tourists effective from 1 January 2016. The Chinese Government announced the year 2017 as Nepal Tourism Promotion Year in China. Both sides have been carrying out joint efforts to promote Nepal in China and to encourage Chinese enterprises to invest in Nepal's tourism sector. Nepal has road connectivity via Rasuwagadhi and Zhangmu for trade and international travellers. There are four other border points designated for bilateral trade. Nepal has direct air links with Lhasa, Chengdu, Kunming, Guangzhou and the Hong Kong SAR of China. Education and Cultural Cooperation: China provides scholarships every year to up to 100 Nepali students studying in China. The Chinese side has been providing Chinese language training for 200 tourism entrepreneurs of Nepal over the next five years, as per the understanding reached between the two sides in March 2016. Both sides have been carrying out activities in the culture and youth sectors as per the provisions of the MoU on Cultural Cooperation (1999) and the MoU on Youth Exchange (2009). Both sides have been promoting people-to-people relations through the regular hosting of cultural festivals, friendly visits by people from different walks of public life, exhibitions, cultural and film shows, food festivals, etc.
Sister-city relations between the cities of the two countries are growing, and both sides have agreed to push cooperation through such relations. These relations are basically meant for carrying out exchanges and cooperation in the fields of economy, trade, transportation, science and technology, culture, tourism, education, sports and health, personnel, etc. Regional and International Affairs: Nepal is a founding member of the AIIB. Nepal holds observer status in the Shanghai Cooperation Organization. Both countries are also members of the Asia Cooperation Dialogue. China is an observer of SAARC. Both countries have been cooperating with each other in various regional and UN forums on matters of common concern. Though Nepal initially let Tibetan Khampa rebels make use of Nepalese territory in the early 1960s, bilateral relations have generally been very good from 1975 onwards, after the annexation of the Kingdom of Sikkim by India that year. As many as twenty thousand Tibetan refugees live in Nepal, and this has been a major issue of concern between China and Nepal. Kathmandu has on several occasions cracked down on the activities of the Tibetans, receiving international condemnation. In 2005, Nepalese Foreign Minister Ramesh Nath Pandey called China "an all-weather friend", and King Gyanendra's regime was also instrumental in inducting China into SAARC. Nepalese in general hold a positive view about the influence of China. In recent years, China has been one of the largest aid donors to Nepal, just behind the UK. China is also Nepal's largest source of FDI. Denmark See Denmark–Nepal relations. European Union Nepal formally established diplomatic relations with the EU in 1975. The EU established its Technical Office in Kathmandu in 1992. Nepal established a residential embassy in Brussels in 1992. The EU Delegation office in Kathmandu has been upgraded to the ambassadorial level since December 2009.
Development Cooperation: The EU is the largest development partner and, taken as a single trade bloc, the second largest trade partner of Nepal. Until 2013, EU assistance to Nepal was provided in two main ways: on a bilateral basis, through the formulation of successive Country Strategy Papers (CSPs) in close partnership with the Government of Nepal, and on a multilateral basis, covering all actions outside the CSP, mainly funded through thematic budget lines. Looking at the history of CSPs for Nepal, the first CSP (2001–2006) allocated €70 million, and the second CSP (2007–2013) allocated €114 million for Nepal. The cumulative contribution from the EU to Nepal's development had reached €360 million, spread over more than 70 projects, by 2013. Starting from 2014, the EU began channeling its development cooperation under its Multi-Annual Indicative Program (MIP). The EU has increased its development cooperation with Nepal threefold for the current period of 2014–2020 compared to the preceding period of the same duration. The MIP identified three focal sectors for Nepal: €146 million for sustainable rural development, with a focus on agricultural productivity and value addition, job creation, market access infrastructure, and nutrition (40.5%); €136.4 million for education, with the aim of improving basic education quality, livelihood skills and equity for the vulnerable and disadvantaged (38%); €74 million for strengthening democracy and decentralization, including engagement in the government's public finance management reform efforts at local and national level (20.5%); and the remaining €3.6 million for other support measures (1%). The EU is also a major donor partner of the Nepal Peace Trust Fund. Cooperation with the European Investment Bank (EIB): Nepal and the EIB signed an umbrella agreement for financial cooperation on 7 May 2012, paving the way for major investments from the EIB in Nepal's infrastructure and energy sectors.
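As a quick cross-check of the MIP figures quoted above, the sector allocations sum to a €360 million envelope and the quoted percentage shares follow from the totals. A minimal sketch (the quoted percentages appear to be truncated rather than rounded, so the comparison allows a small tolerance):

```python
# MIP 2014-2020 focal-sector allocations for Nepal, in EUR millions (figures from the text)
allocations = {
    "sustainable rural development": 146.0,
    "education": 136.4,
    "democracy and decentralization": 74.0,
    "other support measures": 3.6,
}
quoted_shares = {  # percentage shares as quoted in the text
    "sustainable rural development": 40.5,
    "education": 38.0,
    "democracy and decentralization": 20.5,
    "other support measures": 1.0,
}

total = sum(allocations.values())
print(f"total envelope: EUR {total:.1f} million")  # 360.0

for sector, amount in allocations.items():
    share = 100 * amount / total
    # quoted shares are approximate; allow ~0.2 percentage points of slack
    assert abs(share - quoted_shares[sector]) < 0.2
    print(f"{sector}: {share:.2f}% (quoted {quoted_shares[sector]}%)")
```

The check confirms the four amounts and four percentages are internally consistent with each other and with the threefold increase over the roughly €120 million of the 2007–2013 period.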
Following the agreement, the EIB has already committed a loan of €55 million for the Tanahu Hydropower Project (140 MW), which has a total cost of US$500 million. The EIB has also expressed its commitment to an immediate additional concessional loan of Rs. 1.5 billion for the same project. Talks are underway for EIB investments of $120 million for the Kaligandaki-Marsyangdi Corridor Transmission Project and $30 million for upgrading the Trishuli Corridor Transmission Line. Ms. Magdalena Alvarez Arza, Vice President of the European Investment Bank, visited Nepal in June 2014 to work out those commitments. Trade: The EU is one of the principal trading partners of Nepal and its second largest export market, with a 13% share. The EU imports mainly handmade carpets, textiles, gems and jewellery, wood and paper products, leather products, etc. from Nepal. Nepal imports engineering goods, telecommunication equipment, chemicals and minerals, metal and steel, agricultural products, etc. from EU countries. The EU started providing duty-free and quota-free facilities for Nepalese exports under its Everything But Arms (EBA) policy for the LDCs from 2001. The EU introduced the new Generalised System of Preferences (GSP) in 2006, which remained valid until 2015. Under this scheme, the EU duty rate is zero for nearly 2,100 of 11,000 products, excluding arms and ammunition. Humanitarian Aid: The European Commission is one of the biggest sources of humanitarian aid to Nepal. It has long been associated with disaster management and mitigation projects in Nepal. The EU Humanitarian Aid and Civil Protection Department (ECHO) has provided over €74 million worth of humanitarian aid to Nepal since 2001. As a gesture of European solidarity with those worst affected by the recent monsoon floods, the EU Delegation office in Kathmandu provided €250,000 in assistance to flood-affected people of Nepal's mid-western region in September 2014.
Finland France Nepal and the French Republic entered into diplomatic relations on 20 April 1949. The bilateral economic cooperation programme commenced in February 1981, when the two countries signed the First Protocol for a 50 million French franc loan, which was converted into debt in 1989. Food aid, and the counterpart funds it generated, have been the main form of aid since 1991. The main areas of cooperation are the national seismological network, petroleum exploration, restructuring of the Water Supply Corporation, the Kavre Integrated Project and the Gulmi and Arghakhanchi Rural Development Project, rehabilitation of airports, 'food for work', and others. Nepal and France signed an agreement concerning the Reciprocal Promotion and Protection of Investment in 1983. The major areas of French investment are the hotel, restaurant, medicine, aluminium windows and doors, and vehicle body building sectors. Alcatel became the leading supplier of the Nepal Telecommunication Corporation, with 200,000 lines installed, and fibre optic cables. Cegelec secured a 24 million dollar contract for the construction of the Kali Gandaki hydroelectric project. The Government of Nepal awarded a contract to Oberthur Technologies of France in 2010 for the printing, supply, and delivery of machine-readable passports. A significant number of French tourists (24,097 in 2014; 16,405 in 2015; and 20,863 in 2016) arrive in Nepal each year.
the different island territories, and the Islands Regulation was older than the Constitution, many scholars describe the Netherlands Antilles as a federal arrangement. The head of state was the monarch of the Kingdom of the Netherlands, who was represented in the Netherlands Antilles by a governor. The governor and the council of ministers, chaired by a prime minister, formed the government. The Netherlands Antilles had a unicameral legislature called the Parliament of the Netherlands Antilles. Its 22 seats were allocated in fixed numbers to the islands making up the Netherlands Antilles: fourteen for Curaçao, three each for Sint Maarten and Bonaire, and one each for Saba and Sint Eustatius. The Netherlands Antilles were not part of the European Union, but were instead listed as overseas countries and territories (OCTs). This status was kept for all the islands after dissolution, and was to be kept until at least 2015. Economy Tourism, petroleum transshipment and oil refining (on Curaçao), as well as offshore finance, were the mainstays of this small economy, which was closely tied to the outside world. The islands enjoyed a high per capita income and a well-developed infrastructure compared with other countries in the region. Almost all consumer and capital goods were imported, with Venezuela, the United States, and Mexico being the major suppliers, along with the Dutch government, which supported the islands with substantial development aid. Poor soils and inadequate water supplies hampered the development of agriculture. The Antillean guilder had a fixed exchange rate with the United States dollar of 1.79:1. Demographics A large percentage of Netherlands Antilleans descended from European colonists and African slaves who were brought and traded there from the 17th to 19th centuries. The rest of the population originated from other Caribbean islands as well as Latin America, East Asia and elsewhere in the world.
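The fixed seat allocation and the guilder's dollar peg described above are simple enough to express directly. A minimal sketch (the function and variable names are illustrative, not from any official source):

```python
# Seat allocation of the Parliament of the Netherlands Antilles (figures from the text)
seats = {"Curaçao": 14, "Sint Maarten": 3, "Bonaire": 3, "Saba": 1, "Sint Eustatius": 1}
assert sum(seats.values()) == 22  # matches the 22-member unicameral legislature

# Fixed exchange rate: 1.79 Antillean guilders (ANG) per US dollar
PEG_ANG_PER_USD = 1.79

def usd_to_ang(usd: float) -> float:
    """Convert US dollars to Antillean guilders at the fixed peg, rounded to cents."""
    return round(usd * PEG_ANG_PER_USD, 2)

print(usd_to_ang(100))  # 179.0
```

Because the rate was fixed rather than floating, the conversion is a single multiplication; no rate lookup or date is needed.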
In Curaçao there was a strong Jewish element going back to the 17th century slave trade. The language Papiamentu was predominant on Curaçao and Bonaire (as well as the neighboring island of Aruba). This creole descended from Portuguese and West African languages with a strong admixture of Dutch, plus subsequent lexical contributions from Spanish and English. An English-based creole dialect, formally known as Netherlands Antilles Creole, was the native dialect of the inhabitants of Sint Eustatius, Saba and Sint Maarten. After a decades-long debate, English and Papiamentu were made official languages alongside Dutch in early March 2007. Legislation was produced in Dutch, but parliamentary debate was in Papiamentu or English, depending on the island. Due to a massive influx of immigrants from Spanish-speaking territories such as the Dominican Republic in the Windward Islands, and increased tourism from Venezuela in the Leeward Islands, Spanish had also become increasingly used. The majority of the population were followers of the Christian faith, with a Protestant majority in Sint Eustatius and Sint Maarten, and a Roman Catholic majority in Bonaire, Curaçao and Saba. Curaçao also hosted a sizeable group of followers of the Jewish religion, descendants of a Portuguese group of Sephardic Jews that arrived from Amsterdam and Brazil from 1654. In 1982, there was a population of about 2,000 Muslims, with an Islamic association and a mosque in the capital. Most Netherlands Antilleans were Dutch citizens and this status permitted and encouraged the young and university-educated to emigrate to the Netherlands. This exodus was considered to be to the islands' detriment, as it created a brain drain. On the other hand, immigrants from the Dominican Republic, Haiti, the Anglophone Caribbean and Colombia had increased their presence on these islands in later years. 
Antillean diaspora in the Netherlands Culture The origins of the population and location of the islands gave the Netherlands Antilles a mixed culture. Tourism and an overwhelming media presence from the United States increased the regional United States influence. On all the islands, the holiday of Carnival had become an important event after its importation from other Caribbean and Latin American countries in the 1960s. Festivities included "jump-up" parades with beautifully colored costumes, floats, and live bands as well as beauty contests and other competitions. Carnival on the islands also included a middle-of-the-night j'ouvert (juvé) parade that ended at sunrise with the burning of a straw King Momo, cleansing the island of sins and bad luck. Sports The Netherlands Antilles competed in the 1988 Winter Olympics, notably finishing 29th in the bobsled, ahead of Jamaica, who famously competed but finished 30th. Baseball is by far the most popular sport. Several players have made it to the Major Leagues, such as Xander Bogaerts, Andrelton Simmons, Hensley Meulens, Randall Simon, Andruw Jones, Kenley Jansen, Jair Jurrjens, Roger Bernadina, Sidney Ponson, Didi Gregorius, Shairon Martis, Wladimir Balentien, and Yurendell DeCaster. Xander Bogaerts competed in the 2013 World Series for the Boston Red Sox against the St. Louis Cardinals. Andruw Jones played for the Atlanta Braves in the 1996 World Series, hitting two home runs in his first game against the New York Yankees. Three athletes from the former Netherlands Antilles competed in the 2012 Summer Olympics. They, alongside one athlete from South Sudan, competed under the banner of Independent Olympic Athletes. The Netherlands Antilles, though a non-existent entity since 2010, are allowed to field teams at the Chess Olympiad under this name, because the Curaçao Chess Federation remains officially registered as representing the dissolved country in the FIDE Directory.
Miscellaneous topics Unlike in the metropolitan Netherlands, same-sex marriages were not performed in the Netherlands Antilles, but those performed in other jurisdictions were recognised. The main prison of the Netherlands Antilles was Koraal Specht, later known as Bon Futuro. It was known for ill treatment of prisoners and bad conditions throughout the years. The late Venezuelan President Hugo Chávez claimed in 2006, citing military exercises, that the Netherlands was helping the United States to invade Venezuela. See also Index of Netherlands Antilles-related articles Outline of the Netherlands Antilles British West Indies Danish West Indies French West Indies Spanish West Indies External links Government GOV.an – Main governmental site Antillenhuis – Cabinet of the Netherlands Antilles' Plenipotentiary Minister in the Netherlands Central Bank of the Netherlands Antilles General information Netherlands Antilles. The World Factbook. Central Intelligence Agency. Netherlands Antilles from UCB Libraries GovPubs History Method of … Leeward Islands. In the 17th century the islands were conquered by the Dutch West India Company and colonized by Dutch settlers. From the last quarter of the 17th century, the group consisted of six Dutch islands: Curaçao (settled in 1634), Aruba (settled in 1636), Bonaire (settled in 1636), Sint Eustatius (settled in 1636), Saba (settled in 1640) and Sint Maarten (settled in 1648). In the past, Anguilla (1631–1650), the present-day British Virgin Islands (1612–1672), St. Croix and Tobago had also been Dutch.
During the American Revolution, Sint Eustatius, along with Curaçao, was a major trade center in the Caribbean, with Sint Eustatius a major source of supplies for the Thirteen Colonies. It had been called "the Golden Rock" because of the number of wealthy merchants and the volume of trade there. The British sacked its only town, Oranjestad, in 1781, and the economy of the island never recovered. Unlike many other regions, few immigrants went to the Dutch islands, due to the weak economy. However, with the discovery of oil in Venezuela in the early twentieth century, the Anglo-Dutch Shell Oil Company established refineries in Curaçao, while the U.S. processed Venezuelan crude oil in Aruba. This resulted in booming economies on the two islands, which went bust in the 1980s when the oil refineries were closed. The various islands were united as a single country – the Netherlands Antilles – in 1954, under the Dutch crown. The country was dissolved on 10 October 2010. Curaçao and Sint Maarten became distinct constituent countries alongside Aruba, which had become a distinct constituent country in 1986; whereas Bonaire, Sint Eustatius, and Saba (the "BES Islands") became special municipalities within the Netherlands proper. From 1815 onwards Curaçao and Dependencies formed a colony of the Kingdom of the Netherlands. Slavery was abolished in 1863, and in 1865 a government regulation for Curaçao was enacted that allowed for some very limited autonomy for the colony. Although this regulation was replaced by a constitution in 1936, the changes to the government structure remained superficial and Curaçao continued to be ruled as a colony. The island of Curaçao was hit hard by the abolition of slavery in 1863. Its prosperity (and that of neighboring Aruba) was restored in the early 20th century with the construction of oil refineries to service the newly discovered Venezuelan oil fields. Colonial rule ended after the conclusion of the Second World War.
Queen Wilhelmina had promised in a 1942 speech to offer autonomy to the overseas territories of the Netherlands. During the war, the British and American occupation of the islands—with the consent of the Dutch government—led to increasing demands for autonomy within the population as well. In May 1948 a new constitution for the territory entered into force, allowing the largest amount of autonomy possible under the Dutch constitution of 1922. Among other things, universal suffrage was introduced. The territory was also renamed "Netherlands Antilles". After the Dutch constitution was revised in 1948, a new interim Constitution of the Netherlands Antilles was enacted in February 1951. Shortly afterwards, on 3 March 1951, the Island Regulation of the Netherlands Antilles was issued by royal decree, giving fairly wide autonomy to the various island territories in the Netherlands Antilles. A consolidated version of this regulation remained in force until the dissolution of the Netherlands Antilles in 2010. The new constitution was only deemed an interim arrangement, as negotiations for a Charter for the Kingdom were already under way. On 15 December 1954 the Netherlands Antilles, Suriname and the Netherlands acceded as equal partners to an overarching Kingdom of the Netherlands, established by the Charter for the Kingdom of the Netherlands. With this move, the United Nations deemed decolonization of the territory complete and removed the Netherlands Antilles from the United Nations list of Non-Self-Governing Territories. Aruba seceded from the Netherlands Antilles on 1 January 1986, paving the way for a series of referenda among the remaining islands on the future of the Netherlands Antilles. Whereas the ruling parties campaigned for the dissolution of the Netherlands Antilles, the people voted for a restructuring of the Netherlands Antilles.
The coalition campaigning for this option became the Party for the Restructured Antilles, which ruled the Netherlands Antilles for much of the time until its dissolution on 10 October 2010.

Dissolution Even though the referendums held in the early 1990s resulted in a vote in favour of retaining the Netherlands Antilles, the arrangement continued to be an unhappy one. Between June 2000 and April 2005, each island of the Netherlands Antilles had a new referendum on its future status. The four options that could be voted on were: (1) closer ties with the Netherlands; (2) remaining within the Netherlands Antilles; (3) autonomy as a country within the Kingdom of the Netherlands (status aparte); and (4) independence. Of the five islands, Sint Maarten and Curaçao voted for status aparte, Saba and Bonaire voted for closer ties with the Netherlands, and Sint Eustatius voted to stay within the Netherlands Antilles.

On 26 November 2005, a Round Table Conference (RTC) was held between the governments of the Netherlands, Aruba, the Netherlands Antilles, and each island in the Netherlands Antilles. The final statement to emerge from the RTC stated that autonomy for Curaçao and Sint Maarten, plus a new status for Bonaire, Sint Eustatius, and Saba (BES), would come into effect by 1 July 2007. On 12 October 2006, the Netherlands reached an agreement with Bonaire, Sint Eustatius, and Saba: this agreement would make these islands special municipalities. On 3 November 2006, Curaçao and Sint Maarten were granted autonomy in an agreement, but this agreement was rejected by the then island council of Curaçao on 28 November. The Curaçao government was not sufficiently convinced that the agreement would provide enough autonomy for Curaçao. On 9 July 2007 the new island council of Curaçao approved the agreement previously rejected in November 2006. A subsequent referendum approved the agreement as well.
The acts of parliament integrating the "BES" islands (Bonaire, Sint Eustatius and Saba) into the Netherlands were given royal assent on 17 May 2010. After ratification by the Netherlands (6 July), the Netherlands Antilles (20 August), and Aruba (4 September), the Kingdom act amending the Charter for the Kingdom of the Netherlands with regard to the dissolution of the Netherlands Antilles was signed by the three countries in the closing Round Table Conference on 9 September 2010 in The Hague.

Political grouping

Constitutional grouping at time of dissolution The Island Regulation had divided the Netherlands Antilles into four island territories: Aruba, Bonaire, Curaçao (ABC), and the islands in the Leeward Islands. In 1983, the island territory of the Leeward Islands was split up to form the new island territories of Sint Maarten, Saba, and Sint Eustatius (SSS). In 1986, Aruba seceded from the Netherlands Antilles, reducing the number of island territories to five. After the dissolution of the Netherlands Antilles in 2010, Curaçao and Sint Maarten became autonomous countries within the Kingdom, and Bonaire, Sint Eustatius and Saba (BES) became special municipalities of the Netherlands.

Current constitutional grouping The islands of the former country of the Netherlands Antilles are currently divided into two main groups for political and constitutional purposes: those islands that have the status of constituent country of the Kingdom of the Netherlands, and those islands that have the status of special municipality of the Netherlands alone, as distinct from the Kingdom in its entirety. There are also several smaller islands, like Klein Curaçao and Klein Bonaire, that belong to one of the island countries or special municipalities.

Constituent countries There are three Caribbean islands that are countries within the Kingdom of the Netherlands: Aruba, Curaçao, and Sint Maarten. (The Netherlands is the fourth constituent country in the Kingdom of the Netherlands.)
Sint Maarten covers approximately 40% of the island of Saint Martin; the remaining northern part of the island—the Collectivity of Saint-Martin—is an overseas territory of France.

Special municipalities There are three Caribbean islands that are special municipalities of the Netherlands alone: Bonaire, Sint Eustatius, and Saba.

Geography Before its dissolution, the Netherlands Antilles' only land boundary was with France on the island of Saint Martin, and was 10.2 kilometers in length. The Netherlands Antilles had 364 kilometers (432 km before 1986) of coastline.

Climate Tropical; ameliorated by northeast trade winds.

Statistics Maritime claims: exclusive fishing zone; territorial sea. Terrain: generally hilly, volcanic interiors. Elevation extremes: lowest point: Caribbean Sea 0 m; highest point: Mount Scenery. Natural resources: phosphates (Curaçao only), salt (Bonaire only). Land use: arable land 10%; permanent crops 0%; permanent pastures 0%; forests and woodland 0%; other 90% (1993 est.). Irrigated land: NA km². Natural hazards: Curaçao and Bonaire lie south of the Caribbean hurricane belt.
Demographics

Population of the Islands According to the official estimates of the Central Bureau of Statistics of the Netherlands Antilles, the five islands had a combined population of 211,871 as at 1 January 2013. The population of the individual islands was as follows: Bonaire 17,408; Curaçao 154,843; Saba 1,991; Sint Eustatius 4,020; Sint Maarten 33,609. For comparison: Aruba 103,400.

CIA World Factbook demographic statistics The following demographic statistics are from the CIA World Factbook, unless otherwise indicated. The capital and largest city was Willemstad.

Age structure: 0–14 years: 23.9% (male 27,197; female 25,886); 15–64 years: 67.3% (male 71,622; female 77,710)

Sex ratio: 65 years and over: 0.7 male(s)/female; total population: 0.93 male(s)/female (2006 est.)

Infant mortality rate: 9.76 deaths/1,000 live births (2006 est.)

Life expectancy at birth: total population: 76.03 years; male: 73.76 years; female: 78.41 years (2006 est.)

Total fertility rate: 1.98 children born/woman (2008 est.)

Nationality: by law: Dutch (Nederlandse); noun: Netherlands Antillean(s); adjective: Netherlands Antillean

Ethnic groups: mixed black 85%; Carib Amerindian, white, East Asian 15%

Religions: Roman Catholic 72%, Pentecostal 4.9%, Protestant 3.5%, Seventh-day Adventist 3.1%, Methodist 2.9%, other Christian 4.2%, Jehovah's Witnesses 1.7%, Jewish 1.3%

Languages: Dutch, English and Papiamento are official languages. Papiamento (a Portuguese-West African creole with Dutch and Spanish influence) predominates on Curaçao and Bonaire, while English is widely spoken. English is the most commonly spoken language on Sint Maarten, Saba, and Sint Eustatius.

Literacy: definition: age 15 and over can read and write
Economy The Netherlands Antilles was formally dissolved in 2010.

Overview Tourism, petroleum transshipment, and offshore finance were the mainstays of the economy, which was closely tied to the outside world. The islands enjoyed a high per capita income and a well-developed infrastructure as compared with other countries in the region at the time of the dissolution. Almost all consumer and capital goods were imported, with Venezuela, the United States, and Mexico being the major suppliers. Poor soils and inadequate water supplies hampered the development of agriculture.

Statistics Gross domestic product: $3.81 billion. GDP: purchasing power parity - $3.6 billion (2007 est.) GDP - real growth rate: 4.0% (2007 est.) GDP - per capita: purchasing power parity - $19,000 (2007 est.) GDP - composition by sector: agriculture: 1%; industry: 15%; services: 84% (2007 est.) Population below poverty line: NA% Household income or consumption by percentage share: lowest 10%: ±1.5%; highest 10%: ±31% Inflation rate (consumer prices): 3.0% (2007) Labour force: 83,600 (2005) Labour force - by occupation: agriculture 1%, industry 20%, services 79% (2007 est.) Budget: revenues: $757.9 million; expenditures: $949.5 million, including capital expenditures of $NA (2004 est.) Unemployment rate: 9% (2007 est.)

Composition of the Economy Industries: tourism (Curaçao, Sint Maarten, and Bonaire), petroleum refining (Curaçao), petroleum transhipment facilities (Curaçao and Bonaire), light manufacturing (Curaçao) Industrial production growth rate: NA% Electricity - production: 1,005 GWh (2004) Electricity - production by source: fossil fuel: 100%; hydro: 0%; nuclear: 0%; other: 0% (1998) Electricity - consumption: 934.7 GWh (2004) Electricity - exports: 0 kWh (2004) Electricity - imports: 0 kWh (2004) Agriculture - products: aloes, sorghum, peanuts, vegetables, tropical fruit

Exports and Imports Exports: $2.076 billion (f.o.b., 2004) Exports - commodities: petroleum products Exports - partners: US 32%, Panama 10.1%, Guatemala 7.9%, Haiti 6.4%, The Bahamas 5.1% (2005) Imports: $4.383 billion (c.i.f., 2004) Imports - commodities: crude petroleum, food, manufactures Imports - partners: Venezuela 50%, US 22.2%, Italy 5.2%, Netherlands 5% (2005)

Foreign Debt and Economic Aid Debt - external: $2,680 million (2004) Economic aid - recipient: IMF provided $61 million in 2000, and the Netherlands continued its support with $40 million (2004)

Currency Currency: 1 Netherlands Antillean guilder, gulden, or florin (NAf.) = 100 cents Exchange rates: Netherlands Antillean guilders
Communications Radio broadcasts from neighbouring islands, including Bonaire, can be received in Curaçao.

Television Operating television stations include: Cable TV providers: Columbus Communications; United Telecommunication Services (UTS) / TDS TV Distribution Systems N.V. Over-the-top media services: Cariflix

Internet Country code: .cw (Curaçao West Indies) As of 2016, an estimated 138,750 people use the Internet in Curaçao, or 93.6% of the population.
Merchant marine: ships (1,000 GT or over) totaling 1,028,910 GT. Ships by type: bulk 2, cargo 27, chemical tanker 2, combination ore/oil 3, container 16, liquefied gas 4, multi-functional large load carrier 18, passenger 1, petroleum tanker 5, refrigerated cargo 26, roll-on/roll-off 6 (1999 est.) Note: a flag of convenience registry; includes ships of 2 countries: Belgium owns 9 ships, Germany 1 (1998 est.)

Air Airports: 5 (2005 est.) Airports - with paved runways: total: 5; over 3,047 m: 1; 2,438 to 3,047 m: 1; 1,524 to 2,437 m: 1; 914 to 1,523 m: 1; under 914 m: 1 (2005 est.)
New Caledonia also includes a number of remote islets; the Chesterfield Islands are in the Coral Sea. French people, especially locals, call Grande Terre "the pebble". New Caledonia is divided into three provinces. The North and South Provinces are on the New Caledonian mainland, while the Loyalty Islands Province is a series of islands off the mainland. New Caledonia's population of 271,407 (October 2019 census) consists of a mix of the original inhabitants, Kanaks, who are the majority in the North Province and in the Loyalty Islands Province, and people of European descent (Caldoches and Metropolitan French), Polynesians (mostly Wallisians), and Southeast Asians, as well as a few people of Pied-Noir and North African descent, who are the majority in the rich South Province. The capital of the territory is Nouméa.

History The earliest traces of human presence in New Caledonia date back to the period when the Lapita culture was influential in large parts of the Pacific, c. 1600–500 BCE or 1300–200 BCE. The Lapita were highly skilled navigators and agriculturists. The first settlements were concentrated around the coast, and date back to the period between c. 1100 BCE and 200 CE.

British explorer James Cook was the first European to sight New Caledonia, on 4 September 1774, during his second voyage. He named it "New Caledonia", as the northeast of the island reminded him of Scotland. The west coast of Grande Terre was approached by the Comte de Lapérouse in 1788, shortly before his disappearance, and the Loyalty Islands were first visited between 1793 and 1796, when Maré, Lifou, Tiga, and Ouvéa were mapped by the English whaler William Raven. Raven encountered the island then named Britania, today known as Maré (Loyalty Islands), in November 1793. From 1796 until 1840, only a few sporadic contacts with the archipelago were recorded. About 50 American whalers were recorded in the region (Grande Terre, the Loyalty Islands, Walpole and Hunter) between 1793 and 1887.
Contacts with visiting ships became more frequent after 1840, because of interest in sandalwood. As trade in sandalwood declined, it was replaced by a new business enterprise, "blackbirding", a euphemism for taking Melanesian or Western Pacific Islanders from New Caledonia, the Loyalty Islands, the New Hebrides, New Guinea, and the Solomon Islands into slavery, indentured or forced labour in the sugarcane plantations of Fiji and Queensland by various methods of trickery and deception. Blackbirding was practised by both French and Australian traders; in New Caledonia's case, the trade in the early decades of the twentieth century involved kidnapping children from the Loyalty Islands to Grande Terre for forced labour in plantation agriculture. New Caledonia's primary experience with blackbirding, however, revolved around a trade from the New Hebrides (now Vanuatu) to Grande Terre for labour in plantation agriculture and mines, as well as for service as guards over convicts and in some public works. In the early years of the trade, coercion was used to lure Melanesian islanders onto ships. In later years indenture systems were developed; however, in the French slave trade between France's Melanesian colonies of the New Hebrides and New Caledonia, very few regulations were implemented. This represented a departure from contemporary developments in Australia, where increased regulation was developed to mitigate the abuses of blackbirding and 'recruitment' strategies on the coastlines.

The first missionaries from the London Missionary Society and the Marist Brothers arrived in the 1840s. In 1849, the crew of the American ship Cutter was killed and eaten by the Pouma clan. Cannibalism was widespread throughout New Caledonia.

French colonization On 24 September 1853, under orders from Emperor Napoleon III, Admiral Febvrier Despointes took formal possession of New Caledonia.
Captain Louis-Marie-François Tardy de Montravel founded Port-de-France (Nouméa) on 25 June 1854. A few dozen free settlers settled on the west coast in the following years. New Caledonia became a penal colony in 1864, and from the 1860s until the end of the transportations in 1897, France sent about 22,000 criminals and political prisoners to New Caledonia. Records for 1888 indicate that 10,428 convicts, including 2,329 freed ones, were on the island as of 1 May 1888, by far the largest number of convicts detained in French overseas penitentiaries. The convicts included many Communards, arrested after the failed Paris Commune of 1871, including Henri de Rochefort and Louise Michel. Between 1873 and 1876, 4,200 political prisoners were "relegated" to New Caledonia. Only 40 of them settled in the colony; the rest returned to France after being granted amnesty in 1879 and 1880.

In 1864, nickel was discovered on the banks of the Diahot River; with the establishment of the Société Le Nickel in 1876, mining began in earnest. To work the mines the French imported labourers from neighbouring islands and from the New Hebrides, and later from Japan, the Dutch East Indies, and French Indochina. The French government also attempted to encourage European immigration, without much success.

The indigenous population, the Kanak people, were excluded from the French economy and from mining work, and ultimately confined to reservations. This sparked a violent reaction in 1878, when High Chief Ataï of La Foa managed to unite many of the central tribes and launched a guerrilla war that killed 200 Frenchmen and 1,000 Kanaks. A second rebellion broke out in 1917, with Protestant missionaries like Maurice Leenhardt functioning as witnesses to the events of this war. Leenhardt would pen a number of ethnographic works on the Kanak of New Caledonia. Noël of Tiamou led the 1917 rebellion, which resulted in a number of orphaned children, one of whom was taken into the care of Protestant missionary Alphonse Rouel.
This child, Wenceslas Thi, would become the father of Jean-Marie Tjibaou (1936–1989). Europeans brought new diseases such as smallpox and measles, which caused the deaths of many natives. The Kanak population declined from around 60,000 in 1878 to 27,100 in 1921, and their numbers did not increase again until the 1930s.

In June 1940, after the fall of France, the Conseil General of New Caledonia voted unanimously to support the Free French government, and in September the pro-Vichy governor was forced to leave for Indochina. In 1941, some 300 men from the territory volunteered for service overseas. They were joined, in April, by 300 men from French Polynesia ('the Tahitians'), plus a handful from the French districts of the New Hebrides: together they formed the Bataillon du Pacifique (BP). The Caledonians formed two of the companies, and the Polynesians the other two. In May 1941, they sailed to Australia and boarded the RMS Queen Elizabeth for the onward voyage to Africa. They joined the other Free French (FF) battalions in Qastina in August, before moving to the Western Desert with the 1st FF Brigade (1re BFL). There they were one of the four battalions who took part in the breakout after the Battle of Bir Hakeim in 1942. Their losses could not easily be replaced from the Pacific, and they were therefore amalgamated with the Frenchmen of another battalion wearing the anchor of 'la Coloniale', the BIM, to form the Bataillon de l'infanterie de marine et du Pacifique (BIMP). The combined battalion formed part of the Gaullist 1re Division Motorisée d'Infanterie/Division de Marche d'Infanterie (DMI), alongside three divisions from the French North African forces, in the French Expeditionary Corps (CEF) during the Italian Campaign. They landed in Provence in 1944, when they were posted out and replaced by local French volunteers and résistants.
Meanwhile, in March 1942, with the assistance of Australia, New Caledonia became an important Allied base, and the United States Navy's main South Pacific Fleet base moved to Nouméa in 1942–1943. The fleet that turned back the Japanese Navy in the Battle of the Coral Sea in May 1942 was based at Nouméa. American troops stationed on New Caledonia numbered as many as 50,000, matching the entire local population at the time.

French overseas territory In 1946, New Caledonia became an overseas territory. By 1953, French citizenship had been granted to all New Caledonians, regardless of ethnicity. The European and Polynesian populations gradually increased in the years leading to the nickel boom of 1969–1972, and the indigenous Kanak Melanesians became a minority, though they were still the largest ethnic group.

Between 1976 and 1988, conflicts between French government actions and the Kanak independence movement saw periods of serious violence and disorder. In 1983, a statute of "enlarged autonomy" for the territory proposed a five-year transition period and a referendum in 1989. In March 1984, the Kanak resistance, the Front Indépendantiste, seized farms, and the Kanak and Socialist National Liberation Front (FLNKS) formed a provisional government. In January 1985, the French Socialist government offered sovereignty to the Kanaks and legal protection for European settlers. The plan faltered as violence escalated. The government declared a state of emergency; however, regional elections went ahead, and the FLNKS won control of three out of four provinces. The centre-right government elected in France in March 1986 began eroding the arrangements established under the Socialists, redistributing lands mostly without consideration of native land claims; over two-thirds went to Europeans and less than a third to the Kanaks.
By the end of 1987, roadblocks, gun battles and the destruction of property culminated in the Ouvéa cave hostage taking, a dramatic hostage crisis on the eve of the presidential elections in France. Pro-independence militants on Ouvéa killed four gendarmes and took 27 hostage. The military assaulted the cave to rescue the hostages. Nineteen Kanak hostage takers were killed and another three died in custody; two soldiers were killed during the assault. The Matignon Agreements, signed on 26 June 1988, ensured a decade of stability. The Nouméa Accord, signed 5 May 1998, set the groundwork for a 20-year transition that gradually transferred competences to the local government.

Following the timeline set by the Nouméa Accord, which stated that a vote must take place by the end of 2018, the groundwork was laid for a referendum on full independence from France at a meeting chaired by the French Prime Minister Édouard Philippe on 2 November 2017, to be held by November 2018. Voter list eligibility was the subject of a long dispute, but the details were resolved. The referendum was held on 4 November 2018, with independence being rejected: 56.7% of voters chose to remain in France. Another referendum was held in October 2020, in which the margin narrowed, with 53.4% of voters choosing to remain part of France. The Nouméa Accord permits one further referendum to be held, should at least a third of the members of the Congress of New Caledonia request it. The third referendum was held on 12 December 2021. It was boycotted by pro-independence forces, who wanted a delay because of the COVID-19 pandemic and were angry at "stay" campaigning by the French government. As a result, 96% of those who voted chose to stay with France.

Politics New Caledonia is a territory sui generis to which France has gradually transferred certain powers.
As such its citizens have French nationality and vote for the president of France. They have the right to vote in elections to the European Parliament. It is governed by a 54-member Territorial Congress, a legislative body composed of members of three provincial assemblies. The French State is represented in the territory by a High Commissioner. At a national level, New Caledonia is represented in the French Parliament by two deputies and two senators. At the 2012 French presidential election, the voter turnout in New Caledonia was 61.19%.

Demographics Two residents of New Caledonia out of three live in Greater Nouméa. 78% were born in New Caledonia. The total fertility rate went from 2.2 children per woman in 2014 to 1.9 in 2019.

Ethnic groups At the 2019 census, 41.2% of the population reported belonging to the Kanak community (up from 39.1% at the 2014 census) and 24.1% to the European (Caldoche and Zoreille) community (down from 27.2% at the 2014 census). Most of the people who self-identified as "Caledonian" are thought to be ethnically European. The other self-reported communities were Wallisians and Futunians (8.3% of the total population, up from 8.2% at the 2014 census), Indonesians from the Javanese ethnic group (1.4% of the total population, the same as in 2014), Tahitians (2.0% of the total population, down from 2.1% at the 2009 census), Ni-Vanuatu (0.9%, down from 1.0% at the 2014 census), Vietnamese (0.8%, down from 0.9% at the 2014 census), and other Asians (primarily ethnic Chinese; 0.4% of the total population, the same as in 2014). Finally, 11.3% of the population reported belonging to multiple communities (mixed race) (up from 8.6% at the 2014 census), and 9.6% belonged to other communities (mainly "Caledonian").
The question on community belonging, which had been left out of the 2004 census, was reintroduced in 2009 under a new formulation, different from the 1996 census, allowing multiple choices (mixed race) and the possibility to clarify the choice "other". The Kanak people, part of the Melanesian group, are indigenous to New Caledonia. Their social organization is traditionally based on clans, which identify as either "land" or "sea" clans, depending on their original location and the occupation of their ancestors. According to the 2019 census, the Kanak constitute 95% of the population in the Loyalty Islands Province, 72% in the North Province and 29% in the South Province. The Kanak tend to be of lower socio-economic status than the Europeans and other settlers. Europeans first settled in New Caledonia when France established a penal colony on the archipelago. Once the prisoners had completed their sentences, they were given land to settle. According to the 2014 census, of the 73,199 Europeans in New Caledonia 30,484 were native-born, 36,975 were born in Metropolitan France, 488 were born in French Polynesia, 86 were born in Wallis and Futuna, and 5,166 were born abroad. The Europeans are divided into several groups: the Caldoches are usually defined as those born in New Caledonia who have ancestral ties that span back to the early French settlers. They often settled in the rural areas of the western coast of Grande Terre, where many continue to run large cattle properties. Distinct from the Caldoches are those who were born in New Caledonia from families that had settled more recently, and are called simply Caledonians. The Metropolitan French-born migrants who come to New Caledonia are called Métros or Zoreilles, indicating their origins in metropolitan France. 
There is also a community of about 2,000 pieds noirs, descended from European settlers in France's former North African colonies; some of them are prominent in anti-independence politics, including Pierre Maresca, a leader of the RPCR. A 2015 documentary by Al Jazeera English asserted that up to 10% of New Caledonia's population is descended from around 2,000 Arab-Berber people deported from French Algeria in the late 19th century to prisons on the island in reprisal for the Mokrani Revolt in 1871. After serving their sentences, they were released and given land to own and cultivate as part of colonisation efforts on the island. As the overwhelming majority of the Algerians imprisoned on New Caledonia were men, the community was continued through intermarriage with women of other ethnic groups, mainly French women from nearby women's prisons. Despite facing both assimilation into the Euro-French population and discrimination for their ethnic background, descendants of the deportees have succeeded in preserving a common identity as Algerians, including maintaining certain cultural practices (such as Arabic names) and in some cases Islamic religion. Some travel to Algeria as a rite of passage, though obtaining Algerian citizenship is often a difficult process. The largest population of Algerian-Caledonians lives in the commune of Bourail (particularly in the Nessadiou district, where there is an Islamic cultural centre and cemetery), with smaller communities in Nouméa, Koné, Pouembout, and Yaté. Languages The French language began to spread with the establishment of French settlements, and French is now spoken even in the most secluded villages. The level of fluency, however, varies significantly across the population as a whole, primarily due to the absence of universal access to public education before 1953, but also due to immigration and ethnic diversity. 
At the 2009 census, 97.3% of people aged 15 or older reported that they could speak, read and write French, whereas only 1.1% reported that they had no knowledge of French. Other significant language communities among immigrant populations are those of Wallisian and Javanese language speakers. The 28 Kanak languages spoken in New Caledonia are part of the Oceanic group of the Austronesian family. Kanak languages are taught from kindergarten (four languages are taught up to the bachelor's degree) and an academy is responsible for their promotion. The three most widely spoken indigenous languages are Drehu (spoken in Lifou), Nengone (spoken on Maré) and Paicî (northern part of Grande Terre). Others include Iaai (spoken on Ouvéa). At the 2009 census, 35.8% of people aged 15 or older reported that they could speak (but not necessarily read or write) one of the indigenous Melanesian languages, whereas 58.7% reported that they had no knowledge of any of them. Religion The predominant religion is Christianity; half of the population is Roman Catholic, including most of the Europeans, Uveans, and Vietnamese and half of the Melanesian and Polynesian minorities. Roman Catholicism was introduced by French colonists. The island also has numerous Protestant churches, of which the Free Evangelical Church and the Evangelical Church in New Caledonia and the Loyalty Islands have the largest number of adherents; their memberships are almost entirely Melanesian. Protestantism gained ground in the late 20th century and continues to expand. There are also numerous other Christian groups and more than 6,000 Muslims. (See Islam in New Caledonia and Baháʼí Faith in New Caledonia.) Nouméa is the seat of the Roman Catholic Archdiocese of Nouméa. Education Education in New Caledonia is based on the French curriculum and delivered by both French teachers and French–trained teachers. Under the terms of the 1998 Nouméa Accord, primary education is the responsibility of the three provinces. 
As of 2010, secondary education was in the process of being transferred to the provinces. The majority of schools are located in Nouméa, but some are found in the islands and the north of New Caledonia. When students reach high school age, most are sent to Nouméa to continue their secondary education. Education is compulsory from the age of six years.

New Caledonia's main tertiary education institution is the University of New Caledonia (Université de la Nouvelle-Calédonie), which was founded in 1993 and comes under the supervision of the Ministry of Higher Education, Research and Innovation. It is based in Nouméa and offers a range of vocational, Bachelor, MA, and PhD programmes and courses. The University of New Caledonia consists of three academic departments, one institute of technology, one PhD school, and one teacher's college. As of 2013, the university had approximately 3,000 students, 107 academics, and 95 administrative and library staff. Many New Caledonian students also pursue scholarships to study in metropolitan France. As part of the Nouméa Accord process, the Cadre Avenir programme provides scholarships for Kanak professionals to study in France.

Economy New Caledonia has one of the largest economies in the South Pacific, with a GDP of US$9.44 billion in 2019. The nominal GDP per capita was US$34,780 (at market exchange rates) in 2019. It is lower than the nominal GDP per capita of Hawaii, Australia, New Zealand, and Guam, but higher than that of all other independent and non-sovereign countries and territories in Oceania, although there is significant inequality in income distribution, and long-standing structural imbalances between the economically dominant South Province and the less developed North Province and Loyalty Islands. The currency in use in New Caledonia is the CFP franc, which, as of May 2020, is pegged to the euro at a rate of 119.3 CFP francs to 1.00 euro. It is issued by the Institut d'Émission d'Outre-Mer.
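Because the peg is fixed, converting between the two currencies is simple arithmetic. The sketch below illustrates this; the rate of 119.3 CFP francs per euro is the one stated above, while the function names and the two-decimal rounding convention are our own illustrative choices, not an official API.

```python
# Conversion helpers for the CFP franc's fixed peg to the euro.
# Assumption: 119.3 CFP per euro, as stated in the text above;
# helper names and rounding are illustrative only.

CFP_PER_EUR = 119.3  # fixed peg

def eur_to_cfp(eur: float) -> float:
    """Convert euros to CFP francs at the fixed peg, rounded to 2 decimals."""
    return round(eur * CFP_PER_EUR, 2)

def cfp_to_eur(cfp: float) -> float:
    """Convert CFP francs to euros at the fixed peg, rounded to 2 decimals."""
    return round(cfp / CFP_PER_EUR, 2)
```

For example, `eur_to_cfp(100.0)` gives 11930.0 CFP francs, and `cfp_to_eur(119.3)` gives back 1.0 euro.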
Real GDP grew by an average of 3.3% per year in the first half of the 2010s, boosted by rising worldwide nickel prices and by domestic demand driven by rising employment and strong business investment, but by only 0.2% per year in the second half of the decade, as the local nickel industry entered a period of crisis and the repeated independence referendums generated economic uncertainty. In 2011, exports of goods and services from New Caledonia amounted to 2.11 billion US dollars, 75.6% of which were mineral products and alloys (mainly nickel ore and ferronickel). Imports of goods and services amounted to 5.22 billion US dollars; the trade deficit in goods and services thus stood at 3.11 billion US dollars in 2011. Of the goods imports, 22.1% came from Metropolitan France and its overseas departments, 16.1% from other countries in the European Union, 14.6% from Singapore (essentially fuel), 9.6% from Australia, 4.5% from the United States, 4.2% from New Zealand, 2.0% from Japan, and 27.0% from other countries. Financial support from France is substantial, representing more than 15% of GDP, and contributes to the health of the economy. Tourism is underdeveloped, with 100,000 visitors a year, compared with 400,000 in the Cook Islands and 200,000 in Vanuatu. Much of the land is unsuitable for agriculture, and food accounts for about 20% of imports. According to FAOSTAT, New Caledonia is a significant producer of yams (33rd in the world), taro (44th), plantains (50th), and coconuts (52nd). New Caledonia also has an extensive exclusive economic zone. The construction sector accounts for roughly 12% of GDP and employed 9.9% of the salaried population in 2010. Manufacturing is largely confined to small-scale activities such as the processing of foodstuffs, textiles, and plastics. Nickel sector New Caledonian soils contain about 25% of the world's nickel resources.
The late-2000s recession gravely affected the nickel industry, as the sector faced a significant drop in nickel prices (−31.0% year-on-year in 2009) for the second consecutive year. The fall in prices led a number of producers to reduce or halt their activity altogether, reducing the global supply of nickel by 6% compared with 2008. This context, combined with bad weather, forced operators in the sector to revise their production targets downwards; mineral extraction activity declined by 8% in volume year on year, and the share of the nickel sector in GDP fell from 8% in 2008 to 5% in 2009. A trend reversal and a recovery in demand were recorded early in the second half of 2009, allowing a 2.0% increase in local metal production. A March 2020 report stated that "New Caledonia is the world's fourth largest nickel producer, which has seen a 26% rally in prices in the past year". According to industry sources, however, the Goro mine has never met its potential capacity to produce "60,000 tpy of nickel in the form of nickel oxide, due to design flaws and operational commissioning issues"; in 2019, it produced slightly over a third of its annual capacity. In March 2021, Tesla agreed a partnership with the Goro mine, a "technical and industrial partnership to help with product and sustainability standards along with taking nickel for its battery production, according to the agreement", according to a BBC News report. The majority owner, Vale, said that the deal would be of long-term benefit in terms of jobs and the economy. Tesla is a heavy user of nickel for its lithium-ion batteries and wanted to "secure its long-term supply". Also in March 2021, a part of Vale's nickel business was sold "to a consortium called Prony, which includes Swiss commodity trader
Composition New Caledonia is made up of a main island, the Grande Terre, and several smaller islands: the Belep archipelago to the north of the Grande Terre, the Loyalty Islands to the east, the Île des Pins (Isle of Pines) to the south, and the Chesterfield Islands and Bellona Reefs further to the west. Each of these island groups has a different geological origin. The New Caledonia archipelago proper, which includes the Grande Terre, Belep, and the Île des Pins, was born as a series of folds of the earth's mantle between the Permian period (251–299 mya) and the Paleogene and Neogene periods (1.5–66 mya); this mantle obduction created large areas of peridotite and a bedrock rich in nickel. The Loyalty Islands, a hundred kilometers to the east, are coral and limestone islands built on top of ancient collapsed volcanoes, originating from subduction at the Vanuatu trench. The Chesterfield Islands, to the northwest, are reef outcroppings of the oceanic plateau. The Matthew and Hunter Islands, to the east, are volcanic islands that form the southern end of the arc of the New Hebrides. The Grande Terre is by far the largest of the islands and the only mountainous one. It is elongated northwest–southeast, and a mountain range runs its length, with five principal peaks, the highest point being Mont Panié. A territorial dispute exists with regard to the uninhabited Matthew and Hunter Islands, which are claimed by both France (as part of New Caledonia) and Vanuatu. Zealandian origin The New Caledonian archipelago is a microcontinental island chain which originated as a fragment of Zealandia, a nearly submerged continent, or microcontinent, that was part of the southern supercontinent Gondwana during the time of the dinosaurs. The Grande Terre group, with Mont Panié as its highest point, is the most elevated part of the Norfolk Ridge, a long and mostly underwater arm of the continent. While they were still one landmass, Zealandia and Australia together broke away from Antarctica between 85 and 130 million years ago; Australia and Zealandia then split apart 60–85 million years ago. While a continent like Australia consists of a large body of land surrounded by a fringe of continental shelf, Zealandia consists almost entirely of continental shelf, with the vast majority, some 93%, submerged beneath the Pacific Ocean. Geologists consider the possibility that Zealandia was completely submerged about 23 million years ago, although biologists regard this as contrary to the evidence of surviving Gondwanan lineages, and the viewpoint is not universal. Bernard Pelletier argues that the Grande Terre was completely submerged for millions of years, in which case its flora would not be of local origin but the result of long-distance dispersal. Zealandia is larger than Greenland or India and almost half the size of Australia. It is unusually slender, stretching from New Caledonia in the north to beyond New Zealand's subantarctic islands in the south (from latitude 19° south to 56° south, analogous to ranging from Haiti to Hudson Bay, or from Sudan to Sweden, in the Northern Hemisphere). New Zealand is the largest part of Zealandia above sea level, followed by New Caledonia, which forms a promontory ridge on the continent's northern edge. New Caledonia itself drifted away from Australia 66 mya and subsequently moved in a north-easterly direction, reaching its present position about 50 mya. Given its continental origin as a fragment of Zealandia, New Caledonia, unlike many Pacific islands such as the Hawaiian chain, is not of geologically recent volcanic provenance. Its separation from Australia at the end of the Cretaceous (65 mya) and from New Zealand in the mid-Miocene led to a long period of evolution in near-complete isolation; given this long stability, New Caledonia serves as a unique island refugium, a sort of biological 'ark', hosting a unique ecosystem and preserving Gondwanan plant and animal lineages no longer found elsewhere. Its natural heritage significantly comprises species whose ancestors were ancient flora and fauna present on New Caledonia when it broke away from Gondwana millions of years ago; not only species but entire genera and even families are unique to the island and survive nowhere else. Since the age of the dinosaurs, as the island moved north under continental drift, some geologists assert that it may have been fully submerged at various intervals, while botanists argue that some areas must have remained above sea level, serving as refugia for the descendants of the original flora. The isolation of New Caledonia was not absolute, however: new species came to New Caledonia, while species of Gondwanan origin were able to penetrate further eastward into the Pacific island region. Climate The climate of New Caledonia is tropical, modified by southeasterly trade winds, and is hot and humid. Natural hazards are posed by cyclones, which occur most frequently between November and March. While rainfall in the neighboring Vanuatu islands averages two meters annually, rainfall decreases markedly from the north of New Caledonia to the south, the mean annual temperature drops over the same interval, and seasonality becomes more pronounced. The capital, Nouméa, located on a peninsula on the southwestern coast, normally has a dry season which increases in intensity from August until mid-December, ending suddenly with the coming of rain in January.
Ethnic groups Ethnic Melanesians, known as Kanak, constituted 41.2% of the population in 2019, followed by Europeans with 24.1%. The Europeans are the largest ethnic group in the South Province, where they make up a plurality, while Kanak are the majority in the other two provinces. The remainder of the population are Wallisian and Futunan (8.3%), Tahitian (2.0%), Indonesian (1.4%), Ni-Vanuatu (0.9%), Vietnamese, other Asian (0.4%), mixed (11.3%), and other ethnic groups (9.5%). An estimated 15,000 Caledonians are of Algerian descent. CIA World Factbook demographic statistics The following demographic statistics are from the CIA World Factbook. Infant mortality: male 5.9 deaths/1,000 live births; female 4.1 deaths/1,000 live births. Life expectancy at birth: total population 78.4 years; male 74.4 years; female 82.5 years. Total fertility rate: 1.88 children born/woman. Nationality: noun New Caledonian(s); adjective New Caledonian. Religions: Roman Catholic 60%; Protestant 30%; other 10%. Ethnic groups: Kanak 39.1%; European 27.1%; Wallisian and Futunian 8.2%; Tahitian 2.1%; Indonesian 1.4%; Ni-Vanuatu 1%; Vietnamese 0.9%; other 17.7%; unspecified 2.5%. Languages: French (official); 33 Melanesian–Polynesian dialects. Literacy (age 15 and over can read and write): total population 96.9%; male 97.3%; female 96.5%.
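As a quick sanity check, the Factbook percentages quoted above can be summed and compared. A small sketch with the values transcribed from the text; the dictionary structure is illustrative only:

```python
# Consistency checks on the CIA World Factbook figures quoted in the text.
# Values transcribed from the article; this is illustrative arithmetic only.

ethnic_shares = {
    "Kanak": 39.1, "European": 27.1, "Wallisian and Futunian": 8.2,
    "Tahitian": 2.1, "Indonesian": 1.4, "Ni-Vanuatu": 1.0,
    "Vietnamese": 0.9, "Other": 17.7, "Unspecified": 2.5,
}
total = sum(ethnic_shares.values())
print(f"ethnic shares sum to {total:.1f}%")  # 100.0%

# Sex gap in life expectancy at birth, in years (female minus male).
gap = 82.5 - 74.4
print(f"female-male life expectancy gap: {gap:.1f} years")  # 8.1
```

The shares do sum to 100.0%, so the list is internally consistent despite the rounding of each entry.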
Following the timeline set by the Nouméa Accord, the groundwork was laid for a referendum on full independence from France at a meeting chaired by the French Prime Minister Édouard Philippe on 2 November 2017, with the referendum to be held by November 2018. Voter list eligibility had long been a subject of dispute, but the details were resolved at this meeting. In the 2018 referendum, voters narrowly chose to remain a part of France. Two further referendums were permitted, held in 2020 and 2021. The 2020 vote saw a slimmer margin than in 2018, with 46.74% in favor of independence, while the 2021 vote overwhelmingly rejected independence, with 96.49% against it. The current president of the government elected by the Congress is Thierry Santa, from the loyalist (i.e. anti-independence) party The Rally. Executive branch The high commissioner, Laurent Prévost (since 5 August 2019), is appointed by the French president on the advice of the French Ministry of the Interior. The president of the government, Thierry Santa (The Rally, since 6 July 2019), is elected by the members of the Territorial Congress. Legislative branch The Congress (Congrès) has 54 members, being the members of the three regional councils, all elected for a five-year term by proportional representation. Furthermore, there is a 16-member Kanak Customary Senate (two members from each of the eight customary aires). Parliamentarians French National Assembly: Philippe Dunoyer (first constituency, Caledonia Together, CE), elected 2017; Philippe Gomès (second constituency, Caledonia Together, CE), elected 2012. French Senate: Pierre Frogier (The Rally), elected 2011; Gérard Poadja (Caledonia Together, CE), elected 2017. Judicial branch Court of Appeal (Cour d'Appel); county courts; Joint Commerce Tribunal Court; Children's Court. Administrative divisions New Caledonia is divided into three provinces, Province des Îles, Province Nord, and Province Sud, which are further subdivided into 33 communes. International organization participation French-Pacific Banking Agreement; International Confederation of Free Trade Unions; Pacific Islands Forum (associate); the Pacific Community (SPC); United Nations Economic and Social Commission for Asia and the Pacific (associate); World Federation of Trade Unions; World Meteorological Organization.
GDP per capita was 36,376 US dollars in 2007 (at market exchange rates, not at PPP), lower than in Australia and Hawaii but higher than in New Zealand. In 2007, exports from New Caledonia amounted to 2.11 billion US dollars, 96.3% of which were mineral products and alloys (essentially nickel ore and ferronickel). Imports amounted to 2.88 billion US dollars. Of these imports, 26.6% came from Metropolitan France, 16.1% from other European countries, 13.6% from Singapore (essentially fuel), 10.7% from Australia, 4.0% from New Zealand, 3.2% from the United States, 3.0% from China, 3.0% from Japan, and 22.7% from other countries. Tourism As of 2007, about 200 Japanese couples travel to New Caledonia each year for their wedding and honeymoon. Oceania Flash reported in 2007 that one company planned to build a new wedding chapel for Japanese weddings, supplementing the Le Meridien Resort in Nouméa. New Caledonia is also a popular destination for groups of Australian high school students who are studying French. Only a negligible amount of the land is suitable for cultivation, and food accounts for about 20% of imports. In addition to nickel, the substantial financial support from France and tourism are keys to the health of the economy. In the 2000s, large additions were made to nickel mining capacity. The Goro Nickel Plant was expected to be one of the largest nickel-producing plants on Earth, producing an estimated 20% of the global nickel supply once full-scale production began (then projected for 2013). However, environmental concerns over the country's globally recognized ecological heritage may increasingly need to be
Telephones, main lines in use: 53,300 (2004) (up from 44,000 in 1995). Telephones, mobile cellular: 116,400 (2004) (up from 825 in 1995). Telephone system: domestic NA; international: satellite earth station. Shortwave broadcast stations: 0 (2009). Radios: 107,000 (1997). Television broadcast stations: 6 (plus 25 low-power repeaters) (1997). Televisions: 52,000 (1997). Internet service providers (ISPs): 1 (1999). Country code (top-level domain): NC.
the location of caves in inland New Zealand. The main regions of karst topography are the Waitomo District and Takaka Hill in the Tasman District; other notable locations are on the West Coast (Punakaiki), in Hawke's Bay, and in Fiordland. Lava caves (lava tubes) usually form in pāhoehoe lava flows, which are less viscous and typically formed from basalt. When an eruption occurs, the outer layer of the lava flow hardens while the interior remains liquid; the liquid lava, insulated by the hardened crust above, flows out and leaves a hollow tube behind. These caves are found where there are relatively recent basaltic volcanoes in New Zealand, such as the Auckland Volcanic Field, particularly on Rangitoto, Mount Eden, and Matukutūruru. The distribution of sea caves is more sporadic, with their location and orientation controlled by weaknesses in the underlying rock. As cave systems take many thousands of years to develop, they can now be isolated from the water that formed them, whether through a change in sea level or in groundwater flow. If a growing cave breaks through to the surface somewhere else, it becomes a natural arch, like those near Karamea (the Oparara Arches). Rivers and lakes The proportion of New Zealand's area (excluding estuaries) covered by rivers, lakes and ponds, based on figures from the New Zealand Land Cover Database, is (357526 + 81936) / (26821559 − 92499 − 26033 − 19216) = 1.6%. If estuarine open water, mangroves, and herbaceous saline vegetation are included, the figure is 2.2%. The mountainous areas of the North Island are cut by many rivers, many of which are swift and unnavigable. The east of the South Island is marked by wide, braided rivers such as the Wairau, Waimakariri, and Rangitātā; formed from glaciers, they fan out into many strands across gravel plains. The Waikato, flowing through the North Island, is New Zealand's longest river.
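The land-cover proportion above is plain arithmetic on the New Zealand Land Cover Database figures. A minimal sketch reproducing it; which hectare figure maps to which cover class is inferred from the surrounding sentence, not taken from the database schema:

```python
# Rivers plus lakes/ponds as a share of New Zealand's area, excluding
# estuarine and coastal classes, using the figures quoted in the text.
# The class labels on each number are an assumption inferred from the prose.

rivers = 357_526
lakes_and_ponds = 81_936

total_area = 26_821_559
estuarine_open_water = 92_499
mangroves = 26_033
herbaceous_saline = 19_216

land_area = total_area - estuarine_open_water - mangroves - herbaceous_saline
fraction = (rivers + lakes_and_ponds) / land_area
print(f"{fraction:.1%}")  # 1.6%, matching the text

# Including the estuarine classes in both numerator and denominator:
fraction_incl = (rivers + lakes_and_ponds + estuarine_open_water
                 + mangroves + herbaceous_saline) / total_area
print(f"{fraction_incl:.1%}")  # 2.2%, matching the text
```

Both quoted percentages check out against the stated component figures.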
New Zealand's rivers feature hundreds of waterfalls; the most visited are the Huka Falls, which drain Lake Taupō. Lake Taupō, located near the centre of the North Island, is the largest lake by surface area in the country. It lies in a caldera created by the Oruanui eruption, the largest eruption in the world in the past 70,000 years. There are 3,820 lakes with a surface area larger than one hectare, and many have been used as reservoirs for hydroelectric projects. Coastal wetlands Wetlands support a greater concentration of wildlife than any other habitat. New Zealand has six sites included in the List of Wetlands of International Importance (Ramsar sites), including the Whangamarino Wetland. A recent global remote-sensing analysis of tidal flats ranked New Zealand 29th among countries by tidal flat area. Climate The main geographic factors that influence New Zealand's climate are its temperate latitude, with prevailing westerly winds; its oceanic environment; and its mountains, especially the Southern Alps. The climate is mostly temperate, with mean temperatures cooler in the South Island than in the North Island. January and February are the warmest months, July the coldest. New Zealand does not have a large temperature range, apart from Central Otago, but the weather can change rapidly and unexpectedly. Near-subtropical conditions are experienced in Northland. The most rain falls along the west coast of the South Island and the least on the east coast of the South Island and in the interior basins, predominantly on the Canterbury Plains and in the Central Otago Basin. Christchurch is the driest of the main cities, while Hamilton is the wettest, receiving more than twice as much rain per year, followed closely by Auckland.
The wettest area by far is the rugged Fiordland region in the south-west of the South Island, where isolated valleys receive up to 15,000 mm of rain per year, amongst the highest recorded rainfalls in the world. The UV index can be very high to extreme in the hottest times of the year in the north of the North Island, partly because of the country's relatively low air pollution compared to many other countries and its high sunshine hours. New Zealand has very high sunshine hours, with most areas receiving over 2,000 hours per year; the sunniest areas, Nelson/Marlborough and the Bay of Plenty, receive about 2,400 hours per year. The table below lists climate normals for the warmest and coldest months in New Zealand's six largest cities. North Island cities are generally warmest in February; South Island cities are warmest in January. Human geography Political geography New Zealand has no land borders. However, the Ross Dependency, its claim in Antarctica, notionally borders the Australian Antarctic Territory to the west and unclaimed territory to the east. Most other countries do not recognise territorial claims in Antarctica. New Zealand proper is divided administratively into sixteen regions: seven in the South Island and nine in the North. Regional boundaries are based largely on drainage basins, giving the regions a physical geographical coherence. Among the regions, eleven are administered by regional authorities (the top tier of local government), while five are unitary authorities that combine the functions of regional authorities with those of territorial authorities (the second tier). Regional authorities are primarily responsible for environmental resource management, land management, regional transport, and biosecurity and pest management. Territorial authorities administer local roading and reserves, waste management, building consents, the land-use and subdivision aspects of resource management, and other local matters.
The Chatham Islands are not part of any region, although their council operates as a regional authority under the Resource Management Act. A number of outlying islands are not included within regional boundaries; the Kermadecs and the subantarctic islands are inhabited only by a small number of Department of Conservation staff. Population geography The South Island contains a little under one-quarter of the population. Over three-quarters of New Zealand's population live in the North Island, with half living north of Lake Rotorua and one-third of the total population living in the Auckland Region. Auckland is also the fastest-growing region, accounting for 51% of New Zealand's total population growth in the two decades up to 2016. The majority of the indigenous Māori people live in the North Island (87%), although a little under a quarter (24%) live in Auckland. New Zealand is a predominantly urban country: the great majority of the population lives in urban areas, much of it in the 20 main urban areas (population of 30,000 or more) and in the four largest cities of Auckland, Christchurch, Wellington, and Hamilton. (Other major urban areas include Tauranga, Dunedin, and Palmerston North.) New Zealand's population density is nonetheless among the lowest in the world. New Zealand's peoples have been defined by their immigrant origins and the ongoing process of adaptation to a new land, each wave being changed by, and changing, those who came before. This process has led to a distinct distribution of culture across New Zealand. Here language and religion are used as markers for the far richer concept of culture; these metrics unfortunately exclude the political rural–urban divide and the full effects of the Christchurch earthquakes on New Zealand's cultural distribution. New Zealand's most widely spoken language is English (89.8%); however, language, dialect, and accent vary spatially both within and between ethnic groups.
The Māori language (3.5%) is spoken more commonly in areas with large Māori populations (Gisborne, the Bay of Plenty, and Northland). There are many subdialects of Māori, the most pronounced division being between the northern and southern tribes. While migration (typically from north to south) was constant throughout the 16th–18th centuries, the south maintained a distinct culture, largely because of the limited cultivation possible at that latitude. English is spoken with regional accents relating to the origin of immigrants, for example the 19th-century Scottish immigration to Southland and English immigration to Canterbury. This has also occurred with more recent immigration, with a wide variety of accents common in the larger cities where immigrant groups have preferentially settled; these groups change location over time, and accents fade over generations. A wide variety of other languages make up the remaining approximately 6 percent, with Samoan, Hindi, French, and various Chinese languages the most common. These minority languages are concentrated in the main cities, particularly Auckland, where recent immigrant groups have settled. Agricultural geography A relatively small proportion of New Zealand's land is arable (1.76 percent), permanent crops cover 0.27 percent of the land, and some land is irrigated. As the world's largest exporter of sheep, New Zealand's agricultural industry focuses primarily on pastoral farming, particularly dairy and beef cattle as well as lambs; dairy, specifically, is the top export. In addition to pastoral farming, fishermen harvest mussels, oysters, and salmon, and horticulturists grow kiwifruit as well as stone fruit such as peaches and nectarines. New Zealand's distance from world markets and the spatial variation in rainfall, elevation, and soil quality have defined the geography of its agriculture industry.
As of 2007, almost 55 percent of New Zealand's total land area was being used for farming, which is standard compared to most developed countries. Three-fourths of it was pastoral land using for raising sheep, beef, deer, etc. The amount of farmland has decreased since 2002. New Zealand's isolated location has simultaneously lead to fewer pests and an agriculture industry with a greater susceptibility to introduced diseases and pests. A major concern for New Zealand farmers is the rapidly growing wild rabbit population. Wild rabbits have been an agricultural since their introduction to the country in the 1930s. They cause significant damage to farm lands: eating the grass, crops, and causing soil degradation. Many farmers are worried about their livelihoods and the effects that the rabbits will have on food supply and trade, as their numbers are quickly growing out of control. An illegal rabbit-killing virus called the rabbit haemorrhagic disease virus (RHDV) was released in 1997 by a group of vigilante farmers, and was very effective initially. After twenty years, however, the rabbits became immune to it. A new strain of the | Subantarctic Islands are inhabited only by a small number of Department of Conservation staff. Population geography The South Island contains a little under one-quarter of the population. Over three-quarters of New Zealand's population live in the North Island, with half living north of Lake Rotorua, and one-third of the total population living in the Auckland Region. Auckland is also the fastest growing region, accounting for 51% of New Zealand's total population growth (in the two decades up to 2016). The majority of the indigenous Māori people live in the North Island (87%), although a little under a quarter (24%) live in Auckland. New Zealand is a predominantly urban country, with % of the population living in an urban area. 
About % of the population live in the 20 main urban areas (population of 30,000 or more) and % live in the four largest cities of Auckland, Christchurch, Wellington, and Hamilton. (Other major urban areas include Tauranga, Dunedin, and Palmerston North.) New Zealand's population density of around inhabitants per square kilometre is among the lowest in the world. New Zealand's peoples have been defined by their immigrant origin and the ongoing process of adaptation to a new land, being changed by and changing those who came before. This process has led to a distinct distribution of culture across New Zealand. Here language and religion are used as markers for the far richer concept of culture. These markers exclude, however, the political rural–urban divide and the full effects of the Christchurch earthquakes on New Zealand's cultural distribution. New Zealand's most widely spoken language is English (89.8%); however, language, dialect and accent vary spatially both within and between ethnic groups.
An illegal rabbit-killing virus, the rabbit haemorrhagic disease virus (RHDV), was released in 1997 by a group of vigilante farmers and was very effective initially. Within twenty years, however, the rabbits had developed immunity to it. A new strain of the virus, a Korean variant called the K5 virus, or RHDV1-K5, was released in March 2018 with the goal of exterminating 40 percent of the rabbit population. The new strain works much faster than the last one and is expected to kill rabbits within two to four days of exposure. The virus has become a subject of debate among animal rights activists because of the inhumane manner in which it kills the rabbits; farmers, however, have broadly welcomed its release. Almost half of New Zealand's greenhouse gas emissions, mainly methane and nitrous oxide, come from farming and agriculture. Microbes in the stomachs of grazing animals convert New Zealand's grass into methane. The increase of carbon dioxide in the air helps plants grow faster, but the long-term effects of climate change threaten farmers with the likelihood of more frequent and severe floods and droughts. Growers of kiwifruit, a major export of New Zealand's horticulture industry, have experienced difficulties as a result of climate change. In the 2010s, warm winters did not provide the cool temperatures needed for the flowering of kiwifruit, reducing yields. Droughts have also decreased apple production by causing sunburn on the fruit and limiting the water available for irrigation. In contrast, the dairy industry has adjusted well to the effects of climate change.
Natural hazards
Flooding is the most regular natural hazard.
New Zealand is swept by weather systems that bring heavy rain; settlements are usually close to hill-country areas, which experience much higher rainfall than the lowlands due to the orographic effect. Mountain streams that feed the major rivers rise rapidly and frequently break their banks, covering farms with water and silt. Close monitoring, weather forecasting, stopbanks, dams and reafforestation programmes in hill country have ameliorated the worst effects. New Zealand experiences around 14,000 earthquakes a year, some in excess of magnitude 7 (M7). Since 2010, several large (M7, M6.3, M6.4, M6.2) and shallow (all <7 km) earthquakes have occurred immediately beneath Christchurch. These have resulted in 185 deaths, widespread destruction of buildings and significant liquefaction. These earthquakes are releasing distributed stress in the Pacific plate from the ongoing collision with the Indo-Australian plate to the west and north of the city. Volcanic activity is most common on the central North Island Volcanic Plateau. Tsunamis affecting New Zealand are associated with the Pacific Ring of Fire. Droughts are not regular and occur mainly in Otago and the Canterbury Plains, and less frequently elsewhere.
New Zealand English is mostly non-rhotic and sounds similar to Australian English, a common exception being the centralisation of the short i. The Māori language has undergone a process of revitalisation and is spoken by 4 percent of the population. New Zealand has an adult literacy rate of 99 percent, and over half of the population aged 15–29 hold a tertiary qualification. Among the adult population, 14.2 percent have a bachelor's degree or higher, 30.4 percent have some form of secondary qualification as their highest qualification, and 22.4 percent have no formal qualification. At the 2018 census, 37 percent of the population identified as Christian, with Hinduism and Buddhism being the largest minority religions; almost half of the population (48.5 percent) is irreligious. Farming is a major occupation in New Zealand, although more people are employed as sales assistants. Most New Zealanders earn wage or salary income, with a median personal income in 2013 of NZ$28,500.
Terminology
While the demonym for a New Zealand citizen is New Zealander, the informal "Kiwi" is commonly used both internationally and by locals. The name derives from the kiwi, a native flightless bird that is the national symbol of New Zealand. The Māori loanword "Pākehā" usually refers to New Zealanders of European descent, although some reject this appellation, and some Māori use it to refer to all non-Polynesian New Zealanders. Most people born in New Zealand or one of the realm's external territories (Tokelau, the Ross Dependency, the Cook Islands and Niue) before 2006 are New Zealand citizens; further conditions apply to those born from 2006 onwards.
Population
The 2018 census enumerated a resident population of 4,699,755, a 10.8 percent increase over the population recorded in the 2013 census. As of , the total population has risen to an estimated (by extrapolation). The population is increasing at a rate of 1.4–2.0 percent per year.
In May 2020, Statistics New Zealand reported that New Zealand's population had climbed above 5 million people in March 2020; in September 2020, this milestone was revised to six months earlier, September 2019, when population estimates were rebased to the 2018 census. In 2010 the median childbearing age was 30 and the total fertility rate was 2.1 births per woman. In Māori populations the median childbearing age was 26 and the fertility rate 2.8. In 2010 the age-standardised mortality rate was 3.8 deaths per 1,000 (down from 4.8 in 2000) and the infant mortality rate for the total population was 5.1 deaths per 1,000 live births. The life expectancy of a New Zealand child born in 2014–16 was 83.4 years for females and 79.9 years for males, which is among the highest in the world. Life expectancy at birth is forecast to increase from 80 years to 85 years in 2050, and infant mortality is expected to decline. By 2050 the median age is forecast to rise from 36 years to 43 years, and the percentage of people 60 years of age and older to rise from 18 percent to 29 percent. During early migration in 1858, New Zealand had 131 males for every 100 females, but following changes in migration patterns and the modern longevity advantage of women, females came to outnumber males in 1971. As of 2012 there are 0.99 males per female, with males dominating under 15 years and females dominating in the 65 years or older range.
Vital statistics
Population density
New Zealand's population density is relatively low, at . The vast majority of the population live on the main North and South Islands, with New Zealand's major inhabited smaller islands being Waiheke Island (), the Chatham and Pitt Islands () and Stewart Island (381). Over three-quarters of the population live in the North Island ( percent), with one-third of the total population living in the Auckland Region. Most Māori live in the North Island (86.0 percent), although less than a quarter (23.8 percent) live in Auckland.
New Zealand is a predominantly urban country, with percent of the population living in an urban area. About percent of the population live in the 20 main urban areas (population of 30,000 or more) and percent live in the four largest cities of Auckland, Christchurch, Wellington, and Hamilton. Approximately 14 percent of the population live in the four categories of rural area defined by Statistics New Zealand. About 18 percent of the rural population live in areas with a high urban influence (roughly 12.9 people per square kilometre), many working in the main urban area. Rural areas with moderate urban influence and a population density of about 6.5 people per square kilometre account for 26 percent of the rural population. Areas with low urban influence, where the majority of residents work in the rural area, house approximately 42 percent of the rural population. Remote rural areas with a density of less than 1 person per square kilometre account for about 14 percent of the rural population. Before local government reforms in the late 1980s, a borough council with more than 20,000 people could be proclaimed a city. The boundaries of councils tended to follow the edge of the built-up area, so there was little difference between the urban area and the local government area. In 1989, all councils were consolidated into regional councils (top tier) and territorial authorities (second tier), which cover a much wider area and population than the old city councils. Today a territorial authority must have a predominantly urban population of at least 50,000 before it can be officially recognised as a city. The 20 largest urban areas are listed below:
Migration
East Polynesians were the first people to reach New Zealand, in about 1280, followed by early European explorers, notably James Cook, who first arrived in 1769 and explored New Zealand three times, mapping the coastline.
Following the Treaty of Waitangi in 1840, when the country became a British colony, immigrants were predominantly from Britain, Ireland and Australia. Restrictive policies placed limitations on non-European immigrants. During the gold rush period (1858–1880s), large numbers of young men came from California and Victoria to the New Zealand goldfields. Apart from the British, there were Irish, Germans, Scandinavians, Italians and many Chinese. The Chinese were sent special invitations by the Otago Chamber of Commerce in 1866. By 1873 they made up 40 percent of the diggers in Otago and 25 percent of the diggers in Westland. From 1900 there was also significant Dutch, Dalmatian and Italian immigration, together with indirect European immigration through Australia, North America, South America and South Africa. Following the Great Depression, policies were relaxed and migrant diversity increased. In 2008–09, a target of 45,000 migrants was set by the New Zealand Immigration Service (plus a 5,000 tolerance). At the 2018 census, 27.4 percent of people counted were not born in New Zealand, up from 25.2 percent in 2013. In 2018, over half (50.7 percent) of New Zealand's overseas-born population lived in the Auckland Region, including 70 percent of the country's Pacific Island-born population, 61.5 percent of its Asian-born population, and 52 percent of its Middle Eastern- and African-born population. In the late 2000s, Asia overtook the British Isles as the largest source of overseas migrants; in 2013 around 32 percent of overseas-born New Zealand residents were born in Asia (mainly China, India, the Philippines and South Korea), compared to 26 percent born in the UK and Ireland. The number of fee-paying international students increased sharply in the late 1990s, with more than 20,000 studying in public tertiary institutions in 2002.
To be eligible for entry under the skilled migrant plan, applicants must pass a health assessment by an approved doctor, provide a police certificate as proof of good character, and speak sufficient English. Migrants working in some occupations (mainly health) must be registered with the appropriate professional body before they can work in that area. Skilled migrants are assessed by Immigration New Zealand: applicants judged likely to contribute are issued a residence visa, while those with potential are issued a work-to-residence visa. Under the work-to-residence process, applicants are given a temporary work permit for two years and are then eligible to apply for residency. Applicants with a job offer from an accredited New Zealand employer, with cultural or sporting talent, seeking work in an area with a long-term skill shortage, or intending to establish a business can apply for a work-to-residence visa. While most New Zealanders live in New Zealand, there is also a significant diaspora abroad, estimated as of 2001 at over 460,000, or 14 percent of the international total of New Zealand-born. Of these, 360,000 (over three-quarters of the New Zealand-born population residing outside New Zealand) live in Australia. Other communities of New Zealanders abroad are concentrated in other English-speaking countries, specifically the United Kingdom, the United States and Canada, with smaller numbers located elsewhere. Nearly one quarter of New Zealand's highly skilled workers live overseas, mostly in Australia and
the Pacific Community, Asia-Pacific Economic Cooperation, the East Asia Summit, and the ASEAN Regional Forum. It is a member of the Commonwealth of Nations and the Organisation for Economic Co-operation and Development (OECD), and a founding member of the United Nations (UN). New Zealand is party to a number of free-trade agreements, most prominently Closer Economic Relations with Australia and the New Zealand–China Free Trade Agreement. Historically, New Zealand aligned itself strongly with the United Kingdom and had few bilateral relations with other countries. In the later 20th century, relationships in the Asia-Pacific region became more important. New Zealand has also traditionally worked closely with Australia, whose foreign policy followed a similar historical trend. In turn, many Pacific Islands (such as Samoa) have looked to New Zealand's lead. A large proportion of New Zealand's foreign aid goes to these countries, and many Pacific people migrate to New Zealand for employment. Despite the 1986 rupture in the ANZUS military alliance (as a result of New Zealand's nuclear-free policy), New Zealand has maintained good working relations with the United States and Australia on a broad array of international issues.
Political culture
Political change in New Zealand has been gradual and pragmatic rather than revolutionary. The nation's approach to governance has emphasised social welfare and multiculturalism, based on immigration, social integration and the suppression of far-right politics, an approach that has wide public and political support. New Zealand is regarded as one of the most honest countries in the world; in 2017 it was ranked first in the world for the lowest perceived level of corruption by the organisation Transparency International. Democracy and the rule of law are founding political principles in New Zealand. Early Pākehā settlers believed that traditional British legal principles (including individual title to land) would be upheld in New Zealand.
The nation's history, such as the legacy of British colonial rule evidenced in the Westminster system, continues to have an impact on political culture. New Zealand is identified as a "full democracy" in the Economist Intelligence Unit's Democracy Index. The country rates highly for civic participation in the political process, with 80% voter turnout during recent elections, compared with the average of 68%. Since the 1970s, New Zealand has shown a more socially liberal outlook. Beginning with the decriminalisation of homosexuality in 1986, successive governments have progressively increased the protection of LGBT rights, culminating in the legalisation of same-sex marriage in 2013. In 2020, the Abortion Legislation Act, which fully decriminalised abortion, was supported by members from all parties in Parliament. The idea of serving as a moral example to the world has been an important element of New Zealand national identity. The anti-apartheid movement in the 1970s and 1980s, protests against French nuclear testing at Moruroa atoll in the 1970s, and popular support for New Zealand's anti-nuclear policy in the 1980s are manifestations of this. Since the 1990s, New Zealand's anti-nuclear position has been a key element of government policy (irrespective of party) and of the country's "distinctive political identity".
History
Prior to New Zealand becoming a British colony in 1840, politics in New Zealand was dominated by Māori chiefs as leaders of hapū and iwi, utilising Māori customs as a political system.
Colonial politics
After the 1840 Treaty of Waitangi, a colonial governor and his small staff acted on behalf of the British Government based on the British political system. Whereas Māori systems had dominated prior to 1840, governors attempting to introduce British systems met with mixed success in Māori communities. More isolated Māori were little influenced by the Government.
Most influences were felt in and around Russell, the first capital, and Auckland, the second capital. The first voting rights in New Zealand were legislated in 1852 in the New Zealand Constitution Act, for the 1853 elections, and reflected contemporary British practice. The electoral franchise was limited to property-owning male British subjects over 21 years old. The property qualification was relatively liberal in New Zealand compared to Britain, such that by the late 1850s 75% of adult New Zealand European males were eligible to vote, compared to 20% in England and 12% in Scotland. Around 100 Māori chiefs voted in the 1853 election. During the 1850s provincial-based government was the norm. Provincial councils were abolished in 1876. Politics was initially dominated by conservative and wealthy "wool lords" who owned multiple sheep farms, mainly in Canterbury. During the gold rush era starting in 1858, suffrage was extended to all British gold miners who held a one-pound mining licence. The conservatives had been influenced by the militant action of gold miners in Victoria at Eureka. Many gold miners had moved to the New Zealand fields, bringing their radical ideas. The extended franchise was modelled on the Victorian system. In 1863 the mining franchise was extended to goldfield business owners. By 1873, 47% of the 41,500 registered voters were goldfield miners or owners. After the brief Land War period ending in 1864, Parliament moved to extend the franchise to more Māori. Donald McLean introduced a bill for four temporary Māori electorates and extended the franchise to all Māori men over 21 in 1867. Thus Māori men were universally enfranchised 12 years before European men. In 1879 an economic depression hit, resulting in poverty and many people, especially miners, returning to Australia. Between 1879 and 1881 the Government was concerned at the activities of Māori activists based on confiscated land at Parihaka.
Activists destroyed settlers' farm fences and ploughed up roads and land, which incensed local farmers. Arrests followed but the activities persisted. Fears grew among settlers that the resistance campaign was a prelude to armed conflict. The Government itself was puzzled as to why the land had been confiscated, and offered a huge 25,000-acre reserve to the activists, provided they stopped the destruction. Commissioners set up to investigate the issue said that the activities "could fairly be called hostile". A power struggle ensued, resulting in the arrest of all the prominent leaders by a large government force in 1881. Historian Hazel Riseborough describes the event as a conflict over who had authority or mana—the Government or the Parihaka protestors. In 1882 the export of meat in the first refrigerated ship started a period of sustained export-led economic growth. This period is notable for the influence of new social ideas and movements such as the Fabians, and for the creation in 1890 of the first political party, the Liberals. Their leader, former gold miner Richard Seddon from Lancashire, was premier from 1893 to 1906. The Liberals introduced new taxes to break the influence of the wealthy conservative sheep farm owners. They also purchased more land from Māori. (By 1910, Māori in parts of the North Island retained very little land, and the amount of Māori land would decrease precipitously as a result of government purchases.) The early 20th century saw the rise of the trade union movement and labour parties, which represented organised workers. The West Coast town of Blackball is often regarded as the birthplace of the labour movement in New Zealand, as it was the location of the founding of one of the main political organisations which became part of the New Zealand Labour Party.
Māori politics and legislation
Māori political affairs have developed through legislation such as the Resource Management Act 1991, the Te Ture Whenua Māori Act 1993 and many others. Since colonisation in the 1800s, Māori customary law has been suppressed by the imposition of Westminster-style democracy and politics. As reparation for the colonial wars and other injustices of colonisation, the New Zealand Government has formally apologised to the iwi affected, through settlements and legislation. In the 1960s, Māori political relations began to improve. With the Māori Affairs Amendment Act 1967, the legislature enacted a law intended to help, not hinder, Māori in recovering their land. Since then, this progressive change in attitude has materialised in legislation protecting the natural environment, or taonga, and in the courts, which have established Treaty principles that must always be considered when deciding cases. Moreover, the Māori Lands Act 2016 was printed in both te reo Māori and English; the act itself affirms the equal legal status of te reo Māori.
Women in politics
Women's suffrage was granted after about two decades of campaigning by women such as Kate Sheppard and Mary Ann Müller and organisations such as the New Zealand branch of the Women's Christian Temperance Union. On 19 September 1893 the governor, Lord Glasgow, signed a new Electoral Act into law. As a result, New Zealand became the first self-governing nation in the world in which all women had the right to vote in parliamentary elections. Women first voted in the 1893 election, with a high 85% turnout (compared to 70% of men). Women were not eligible to be elected to the House of Representatives until 1919, however, when three women, including Ellen Melville, stood. The first woman to win an election (to the seat held by her late husband) was Elizabeth McCombs in 1933.
Mabel Howard became the first female cabinet minister in 1947, being appointed to the First Labour Government. New Zealand was the first country in the world in which all the highest offices were occupied by women, between March 2005 and August 2006: the Sovereign Queen Elizabeth II, Governor-General Dame Silvia Cartwright, Prime Minister Helen Clark, Speaker of the House Margaret Wilson, and Chief Justice Dame Sian Elias.
Modern political history
The right-leaning National Party and the left-leaning Labour Party have dominated New Zealand political life since a Labour government came to power in 1935. During fourteen years in office (1935–1949), the Labour Party implemented a broad array of social and economic legislation, including comprehensive social security, a large-scale public works programme, a forty-hour working week, and compulsory unionism. The National Party won control of the government in 1949, accepting most of Labour's welfare measures. Except for two brief periods of Labour governments in 1957–1960 and 1972–1975, National held power until 1984. The greatest challenge to the first and later Labour governments' policies on the welfare state and a regulated economy that combined state and private enterprise came from the Labour Party itself. After regaining control in 1984, the fourth Labour government instituted a series of radical market-oriented reforms. It privatised state assets and reduced the role of the state in the economy. It also instituted a number of other more left-wing reforms, such as allowing the Waitangi Tribunal to hear claims of breaches of the Treaty of Waitangi to
The country rates highly for civic participation in the political process, with 80% voter turnout during recent elections, compared with the average of 68%. Since the 1970s, New Zealand has shown a more socially liberal outlook. Beginning with the decriminalisation of homosexuality in 1986, successive governments have progressively increased the protection of LGBT rights, culminating in the legalisation of same-sex marriage in 2013. In 2020, the Abortion Legislation Act, that fully decriminalised abortion, was supported by members from all parties in Parliament. The idea of serving as a moral example to the world has been an important element of New Zealand national identity. The anti-apartheid movement in the 1970s and 1980s, protests against French nuclear testing at Moruroa atoll in the 1970s, and popular support for New Zealand's anti-nuclear policy in the 1980s are manifestations of this. From the 1990s New Zealand's anti-nuclear position has become a key element of government policy (irrespective of party) and of the country's "distinctive political identity". History Prior to New Zealand becoming a British colony in 1840, politics in New Zealand was dominated by Māori chiefs as leaders of hapu and iwi, utilising Māori customs as a political system. Colonial politics After the 1840 Treaty of Waitangi, a colonial governor and his small staff acted on behalf of the British Government based on the British political system. Whereas Māori systems had dominated prior to 1840, governors attempting to introduce British systems met with mixed success in Māori communities. More isolated Māori were little influenced by the Government. Most influences were felt in and around Russell, the first capital, and Auckland, the second capital. The first voting rights in New Zealand were legislated in 1852 as the New Zealand Constitution Act for the 1853 elections and reflected contemporary British practice. 
The electoral franchise was limited to property-owning male British subjects over 21 years old. The property qualification was relatively liberal in New Zealand compared to Britain, such that by the late 1850s 75% of adult New Zealand European males were eligible to vote, compared to 20% in England and 12% in Scotland. Around 100 Māori chiefs voted in the 1853 election. During the 1850s provincial-based government was the norm. Provincial councils were abolished in 1876. Politics was initially dominated by conservative and wealthy "wool lords" who owned multiple sheep farms, mainly in Canterbury. During the gold rush era starting in 1858, suffrage was extended to all British gold miners who held a one-pound mining licence. The conservatives had been influenced by the militant action of gold miners in Victoria at Eureka. Many gold miners had moved to the New Zealand fields, bringing their radical ideas. The extended franchise was modelled on the Victorian system. In 1863 the mining franchise was extended to goldfield business owners. By 1873, 47% of the 41,500 registered voters were goldfield miners or owners. After the brief Land War period ending in 1864, Parliament moved to extend the franchise to more Māori. Donald McLean introduced a bill for four temporary Māori electorates and extended the franchise to all Māori men over 21 in 1867. As such, Māori were universally franchised 12 years prior to European men. In 1879 an economic depression hit, resulting in poverty and many people, especially miners, returning to Australia. Between 1879 and 1881, the Government was concerned about the activities of Māori activists based on confiscated land at Parihaka. Activists destroyed settlers' farm fences and ploughed up roads and land, which incensed local farmers. Arrests followed but the activities persisted. Fears grew among settlers that the resistance campaign was a prelude to armed conflict.
The Government itself was puzzled as to why the land had been confiscated and offered a huge 25,000-acre reserve to the activists, provided they stopped the destruction. Commissioners set up to investigate the issue said that the activities "could fairly be called hostile". A power struggle ensued, resulting in the arrest of all the prominent leaders by a large government force in 1881. Historian Hazel Riseborough describes the event as a conflict over who had authority or mana—the Government or the Parihaka protestors. In 1882 the first export shipment of meat aboard a refrigerated ship started a period of sustained economic export-led growth. This period is notable for the influence of new social ideas and movements such as the Fabians and the creation in 1890 of the first political party, the Liberals. Their leader, former gold miner Richard Seddon from Lancashire, was premier from 1893 to 1906. The Liberals introduced new taxes to break the influence of the wealthy conservative sheep farm owners. They also purchased more land from Māori. (By 1910, Māori in parts of the North Island retained very little land, and the amount of Māori land would decrease precipitously as a result of government purchases.) The early 20th century saw the rise of the trade union movement and labour parties, which represented organised workers. The West Coast town of Blackball is often regarded as the birthplace of the labour movement in New Zealand, as it was the location of the founding of one of the main political organisations which became part of the New Zealand Labour Party. Māori politics and legislation Māori political affairs have been developing through legislation such as the Resource Management Act 1991 and the Te Ture Whenua Māori Act 1993 and many more. Since colonisation in the 1800s, Māori have had their customary laws oppressed, with the imposition of a Westminster democracy and political style.
As reparation for the colonial wars and other injustices of colonisation, the New Zealand Government has formally apologised to the iwi affected, through settlements and legislation. In the 1960s, Māori political relations began to improve. With the Māori Affairs Amendment Act 1967, the legislature enacted a law intended to help Māori recover their land rather than hinder them. Since then, this change in attitude has materialised in legislation protecting the natural environment, or taonga, and in the courts through the establishment of Treaty principles that must be considered when cases are decided. Moreover, the Māori Language Act 2016 was printed in both te reo Māori and English; the act itself affirms the equal legal status of te reo Māori. Women in politics Women's suffrage was granted after about two decades of campaigning by women such as Kate Sheppard and Mary Ann Müller and organisations such as the New Zealand branch of the Women's Christian Temperance Union. On 19 September 1893 the governor, Lord Glasgow, signed a new Electoral Act into law. As a result, New Zealand became the first self-governing nation in the world in which all women had the right to vote in parliamentary elections. Women first voted in the 1893 election, with a high 85% turnout (compared to 70% of men). However, women were not eligible to be elected to the House of Representatives until 1919, when three women, including Ellen Melville, stood. The first woman to win an election (to the seat held by her late husband) was Elizabeth McCombs in 1933. Mabel Howard became the first female cabinet minister in 1947, being appointed to the First Labour Government.
Although consistently ranked among the least corrupt countries in the world, New Zealand is not free of corruption. Regional economies In 2015 Statistics New Zealand published details of the breakdown of gross domestic product in the regions of New Zealand for the year ended March 2015. Unemployment Prior to the economic shock created by Britain's decision to join the EEC in 1973, which removed the UK as New Zealand's primary market for exports, unemployment in New Zealand was very low. A recession and a collapse in wool prices in 1966 led to unemployment rising by 131%, but this still represented only a 0.7 percentage-point increase in the unemployment rate. After 1973, unemployment became a persistent economic and social issue in New Zealand. Recessions from 1976 to 1978 and from 1982 to 1983 greatly increased unemployment again. Between 1985 and 2012 the unemployment rate averaged 6.29%. After the stock market crash of 1987, unemployment rose 170%, reaching an all-time high of 11.20% in September 1991. The Asian financial crisis of 1997 sent unemployment upwards again, by 28%. By 2007 it had dropped again and the rate stood at 3.5% (December 2007), its lowest level since the current method of surveying began in 1986. This gave the country the 5th-best ranking in the OECD (with an OECD average at the time of 5.5%). The low numbers correlated with a robust economy and a large backlog of job vacancies at all levels. Unemployment numbers are not always directly comparable between OECD nations, as members do not all keep labour market statistics in the same way. The percentage of the population employed has also increased in recent years, to 68.8% of all inhabitants, with full-time jobs increasing slightly and part-time occupations decreasing in turn. The increase in the working population percentage is attributed to increasing wages and higher costs of living moving more people into employment. The low unemployment also had some disadvantages, with many companies unable to fill jobs.
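The distinction drawn above between a relative rise (131%) and a percentage-point rise (0.7 points) can be made concrete with a short sketch. The two unemployment rates used below are hypothetical, chosen only to be roughly consistent with the figures quoted:

```python
def percentage_point_change(old_rate, new_rate):
    """Absolute change between two rates, in percentage points."""
    return new_rate - old_rate

def relative_change(old_rate, new_rate):
    """Proportional change, as a percentage of the old rate."""
    return (new_rate - old_rate) / old_rate * 100

# Hypothetical rates (in %): from a very low base, a small absolute
# rise looks dramatic in relative terms, as in the 1966 episode.
old, new = 0.53, 1.23
print(f"{percentage_point_change(old, new):.1f} percentage points")
print(f"{relative_change(old, new):.0f}% relative rise")
```

This is why the text can report both a 131% jump and an unemployment rate that remained under 2%.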
From December 2007, mainly as a result of the global financial crisis, unemployment numbers began to rise. This trend continued until September 2012, reaching a high of 6.7%. They began to recover after that point, sitting at 3.9%. Housing affordability Shamubeel Eaqub, formerly a principal economist at the New Zealand Institute of Economic Research (NZIER), said in 2014 that thirty years prior, an average house in New Zealand cost two or three times the average household income. House prices rose dramatically in the first years of the 21st century and by 2007, an average house cost more than six times household income. International surveys in 2013 showed that housing was unaffordable in all eight of New Zealand's major markets – "unaffordable" being defined as house prices which are more than three times the median regional income. Demand for property has been strongest in Auckland, pushing up prices in the city by 52% in the last five years. In 2014 the average sales price there went from $619,136 to $696,047, a rise of 12% in that 12-month period alone. In 2015, prices rose another 14%. This makes Auckland New Zealand's least affordable market and one of the most expensive cities in the world, with houses costing 8 times the average income. Between 2012 and April 2016, the average Auckland home increased in price by just over two-thirds, reaching $931,000 – higher than the cost of an average home in Sydney. As a result, more and more people are being pushed out of the property market. Those on low incomes are hardest hit, affecting many Māori and Pacific Islanders. New Zealand's relatively high mortgage rates are exacerbating the problem, making it difficult for young people with steady jobs to buy their first home. According to a 2012 submission made to the Housing Affordability Inquiry, escalating house prices are also affecting many middle-income groups, especially those with large families.
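The affordability measure these surveys use is a "median multiple": house price divided by annual household income, with anything above three times income counted as unaffordable. A minimal sketch, in which the Auckland income figure is hypothetical (chosen so the multiple comes out near the 8× quoted above):

```python
def median_multiple(median_house_price, median_household_income):
    """House price expressed as a multiple of annual household income."""
    return median_house_price / median_household_income

def is_affordable(multiple, threshold=3.0):
    """The surveys treat anything above three times income as unaffordable."""
    return multiple <= threshold

# $931,000 average Auckland home (from the text) against a hypothetical
# ~$116,000 household income gives a multiple of roughly 8.
m = median_multiple(931_000, 116_000)
print(round(m, 1), "affordable" if is_affordable(m) else "unaffordable")
```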
Mortgage adviser Bruce Patten said the trend was "disturbing" and added to the gap between the "haves and have-nots". Property-analysis company CoreLogic says that 45% of house purchases in New Zealand are now made by investors who already own a home, while another 28% are made by people moving from one property to another. Approximately 8% of purchases go to overseas-based cash buyers – primarily Australians, Chinese, and British – although most economists believe that foreign investment is currently too small to have a significant effect on property prices. Whether purchases are made by New Zealanders or by foreigners, it is generally those who are already well off who are buying the bulk of properties on the market. This has had a dramatic effect on home-ownership rates among Kiwis, now at their lowest level since 1951. Even as recently as 1991, 76% of New Zealand homes were occupied by their owners. By 2013, this had reduced to 63%, indicating that more and more people are having to rent. Raewyn Cox, chief executive of the Federation of Family Budgeting, says: "High prices and high interest rates (have) sentenced a rising number of New Zealanders to be lifetime tenants" where they are "stuck in expensive rental situations, heading towards retirement." Inequality Between 1982 and 2011 New Zealand's gross domestic product grew by 35%. Almost half of that increase went to a small group who were already the richest in the country. During this period, the average income of the top 10% of earners in New Zealand (those earning more than $72,000) almost doubled, going from $56,300 to $100,200. The average income of the poorest tenth increased by 13%, from $9,700 to $11,000. Statistics New Zealand, which keeps track of income disparity using the P80/20 ratio, confirms the increase in income inequality. The ratio shows the difference between high household incomes (those in the 80th percentile) and low household incomes (those in the 20th percentile).
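The P80/20 measure described above is a simple ratio: the income at the 80th percentile of households divided by the income at the 20th percentile. A sketch using invented sample incomes and a nearest-rank percentile, for illustration only:

```python
import math

def percentile(values, p):
    """Nearest-rank percentile (0 < p <= 100) of a list of values."""
    ordered = sorted(values)
    k = max(0, math.ceil(p / 100 * len(ordered)) - 1)
    return ordered[k]

def p80_20_ratio(household_incomes):
    """Ratio of high (80th percentile) to low (20th percentile) incomes."""
    return percentile(household_incomes, 80) / percentile(household_incomes, 20)

# Invented sample of ten household incomes (NZ$):
incomes = [18_000, 24_000, 31_000, 40_000, 47_000,
           55_000, 64_000, 76_000, 94_000, 130_000]
print(round(p80_20_ratio(incomes), 2))  # 76,000 / 24,000 ≈ 3.17
```

A rising ratio means the gap between high- and low-income households is widening, which is what the statistics agency reported for the late 1980s and 1990s.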
The inequality ratio increased between 1988 and 2004, decreased until the onset of the Global Financial Crisis in 2008, increased again to 2011, and then declined once more. By 2013 the disposable income of high-income households was more than two-and-a-half times larger than that of low-income households. Highlighting the disparity, the top 1% of the population now owns 16% of the country's wealth – the richest 5% at one point owned 38% – while half the population, including beneficiaries and pensioners, earns less than $24,000. Superannuation New Zealand has a universal superannuation scheme. Everyone aged 65 or over who is a New Zealand citizen or permanent resident and who normally lives in New Zealand at the time they apply is eligible. They must also have lived in New Zealand for at least 10 years since they turned 20, with five of those years since they turned 50. Time spent overseas in certain countries and for certain reasons may be counted for New Zealand superannuation. New Zealand superannuation is taxed, the rate of which depends on superannuitants' other income. The amount of superannuation paid depends on the person's household situation. For a married couple, the net (after-tax) amount is set by legislation to be no less than 66% of the net average wage. Because of the growing number of elderly becoming eligible, superannuation costs rose from $7.3 billion a year in 2008 to $10.2 billion in 2014. In 2011 there were twice as many children in New Zealand as elderly (65 and over); by 2051 there are projected to be 60% more elderly than children. In the ten years from 2014, the number of New Zealanders over the age of 65 was projected to grow by about 200,000. This poses a significant problem for superannuation. The government gradually increased the age of eligibility from 61 to 65 between 1993 and 2001.
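The eligibility rules above compose into a simple conjunction of tests. A simplified sketch (it deliberately ignores the provision that counts certain time spent overseas, and the function name is illustrative):

```python
def nz_super_eligible(age, citizen_or_permanent_resident, ordinarily_resident,
                      years_in_nz_since_20, years_in_nz_since_50):
    """Simplified check of the NZ Superannuation criteria described above:
    aged 65+, citizen or permanent resident, normally resident in NZ,
    10+ years in NZ since age 20, of which 5+ since age 50."""
    return (age >= 65
            and citizen_or_permanent_resident
            and ordinarily_resident
            and years_in_nz_since_20 >= 10
            and years_in_nz_since_50 >= 5)

print(nz_super_eligible(66, True, True, 40, 16))  # True
print(nz_super_eligible(66, True, True, 12, 3))   # False: under 5 years since 50
```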
In that year the Labour Government of Helen Clark introduced the New Zealand Superannuation Fund (known as the "Cullen Fund" after Minister of Finance Michael Cullen) to part-fund the superannuation scheme into the future. As at October 2014, the fund managed NZ$27.11 billion, 15.9% of which it invested within New Zealand. In 2007 the same Government introduced a new individual savings scheme, known as KiwiSaver. KiwiSaver principally targets growing people's retirement savings, but younger participants can also use it to save a deposit for their first home. The scheme is voluntary, work-based and managed by private-sector companies called "KiwiSaver providers". KiwiSaver had 2.3 million active members (60.9% of New Zealand's population under 65). NZ$4 billion was contributed annually, and a total of NZ$19.1 billion has been contributed since 2007. Consumption New Zealanders see themselves as first-world consumers with first-world tastes and habits, mitigated only slightly by the country's remoteness from main global producers. Infrastructure According to the National Infrastructure Unit of the Treasury, New Zealand "...continues to face challenges to its infrastructure; all forms of infrastructure are long-term investments, and change does not come about easily or quickly." A report prepared for the Association of Consulting and Engineering New Zealand in 2020 claimed that there was an infrastructure deficit of $75 billion (about one quarter of GDP), following decades of under-investment that began in the 1980s. Transport New Zealand's transport infrastructure is "generally well developed." Road network The New Zealand state highway network consists of 11,000 km of road, with 5,981.3 km in the North Island and 4,924.4 km in the South Island, built and maintained by the NZ Transport Agency, and paid for from general taxation and fuel excise duty. Heavy road users must also pay Road User Charges; there is limited use of tolling on state highways.
There is also 83,000 km of local roads built and maintained by local authorities. Railway network The railway network is owned by state-owned enterprise KiwiRail and consists of 3,898 km of narrow-gauge railway line, of which 506 km is electrified. Airways There are seven international airports and twenty-eight domestic airports. Air New Zealand, 52% government-owned, is the national carrier and a state-owned enterprise. Airways New Zealand, another state-owned enterprise, provides air traffic control and communications. Seaports New Zealand has 14 international seaports. Telecommunications Present-day telecommunications in New Zealand include telephony, radio, television, and internet usage. A competitive telecommunications market has seen mobile prices drop to some of the lowest in the OECD. The copper wire and fibre cable networks are mostly owned by Chorus Limited, a publicly listed company. Chorus wholesales services to retail providers (such as Spark). In the mobile sector, there are three operators: Spark, Vodafone and 2degrees. Internet New Zealand has a high rate of internet use. There are 1,916,000 broadband connections and 65,000 dial-up connections in New Zealand, of which 1,595,000 are residential and 386,000 are business or government. The majority of connections are digital subscriber line over phone lines. The Government has two plans to bring Ultra-Fast Broadband to 97.8% of the population by 2019, and is spending NZ$1.35 billion on public-private partnerships to roll out fibre-to-the-home connections in all main towns and cities with populations over 10,000. The programme aims to deliver ultra-fast broadband capable of at least 100 Mbit/s download and 50 Mbit/s upload to 75% of New Zealanders by 2019. In total, 1,340,000 households in 26 towns and cities will be connected.
Gigabit internet (1,000 Mbit/s download speeds) was made available to the entire Ultra-Fast Broadband (UFB) footprint on 1 October 2016, in an announcement from Chorus. A $300 million Rural Broadband Initiative (RBI) has also been introduced by the Government, with the aim of bringing broadband of at least 5 Mbit/s to 86% of rural customers by 2016. Energy From 1995 to 2013, the energy intensity of the economy per unit of GDP declined by 25 percent. A contributing factor is the growth of relatively less energy-intensive service industries. New Zealand is potentially among the main winners once the global transition to renewable energy is completed; the country is placed very high – no. 5 among 156 countries – in the index of geopolitical gains and losses after energy transition (GeGaLo Index). Electricity The electricity market is regulated by the Electricity Industry Participation Code administered by the Electricity Authority (EA). The electricity sector uses mainly renewable energy sources such as hydropower, geothermal power and increasingly wind energy. The 83% share of renewable energy sources makes New Zealand one of the most sustainable economies in terms of energy generation. New Zealand suffers from a geographical imbalance between electricity production and consumption. The most substantial electricity generation (both existing and as remaining potential) is located on the South Island and to a lesser degree in the central North Island, while the main demand (which is continuing to grow) is in the northern North Island, particularly the Auckland Region. This requires electricity to be transmitted north through a power grid which is reaching its capacity more often. Water As at 2021, almost all of the three waters assets (drinking water, stormwater and wastewater) are owned by local councils and territorial authorities. There are currently 67 different asset-owning organisations in total.
The challenges for local government include funding infrastructure deficits and preparing for large re-investments that are estimated to require $110 billion over the next 30 to 40 years. There are also significant challenges in meeting statutory requirements for the safety of drinking water, and the environmental expectations for management of stormwater and wastewater. Climate change adaptation and providing for population growth add to these challenges. A nationwide reform programme is underway, with the intention of amalgamating the three waters assets into a small number of large regional publicly owned utilities. History For many years New Zealand's economy was built on a narrow range of agricultural products, such as wool, meat and dairy. These products became New Zealand's staple and most valuable exports, underpinning the success of the economy from the 1850s until the 1970s. For example, from 1920 to the late 1930s, the dairy export quota was usually around 35% of New Zealand's total exports, and in some years made up almost 45%. Due to the high demand for these primary products, manifested by the New Zealand wool boom of 1951, New Zealand had one of the highest standards of living in the world for 70 years. In the 1960s, prices for these traditional exports declined, and in 1973 New Zealand lost its preferential trading position with the United Kingdom when the latter joined the European Economic Community. Partly as a result, from 1970 to 1990, the relative New Zealand GDP per capita adjusted for purchasing power declined from about 115% of the OECD average to 80%. Between 1984 and 1993, New Zealand changed from a somewhat closed and centrally controlled economy to one of the most open economies in the OECD. In a process often referred to in New Zealand as Rogernomics, successive governments introduced policies which dramatically liberalised the economy. In 2005 the World Bank praised New Zealand as the most business-friendly country in the world.
The economy diversified and by 2008, tourism had become the single biggest generator of foreign exchange. Early years Prior to European settlement and colonisation of New Zealand, Māori had a subsistence economy, the basic economic unit of which was the sub-tribe or hapū. From the 1790s, the waters around New Zealand were visited by British, French and American whaling, sealing and trading ships. Their crews traded European goods, including guns and metal tools, for Māori food, water, wood, flax and sex. Their increasing lawlessness and plans for formal settlement by the New Zealand Company were two of the drivers behind the signing of the Treaty of Waitangi in 1840, which established New Zealand as a colony. Settlers continued to be dependent on Māori for food until the 1860s. From then, immigrants became self-sufficient in farming and started mining a variety of minerals, including gold, which was discovered at Gabriel's Gully in Central Otago, leading to the Otago Gold Rush in 1861. Settlements flourished in areas where these mining operations were established. In the 1880s, Dunedin became the richest city in the country largely on the back of investments from the gold rush. Sheep farming began in the Wairarapa but soon spread up and down the east coast from Southland to the East Cape once rudimentary roads and transport became available. Much of the land used for farming was taken or leased from Māori. Sheep numbers grew quickly and by the mid-1850s, there were already a million sheep in New Zealand; by the early 1870s, there were 10 million. Wool became the first staple export, initially exported from the Wellington settlement in the late 1850s, although unrefrigerated meat and dairy products were exported as far as Australia. In the 1870s, Julius Vogel was periodically both colonial treasurer and premier.
He viewed New Zealand as a "Britain of the South Seas" and began the development of infrastructure in New Zealand, investing heavily in roads, railways, telegraphs and bridges, funded by public borrowing. Progress slowed after the collapse of the City of Glasgow Bank in 1878, which led to a contraction in credit from London, the centre of the world's financial system at the time. Economic activity was depressed for some years afterwards, until refrigeration was introduced in 1882. This enabled New Zealand to start exporting meat and other frozen products to the United Kingdom. Refrigeration transformed and shaped the development of the economy but, in the process, established New Zealand's economic dependence on Britain. The success of refrigeration was directly related to the growth and development of farming in the country. In the 19th century, the bulk of economic activity was in the South Island of New Zealand. From around 1900, dairy farming became increasingly viable in areas which were less suitable for sheep, particularly in Northland, the Waikato and Taranaki. As dairying developed, the North Island slowly became more important to the economy. As more land was cultivated and farmed, Britain became the sole market for New Zealand meat and animal products. Dairy farming can therefore be seen as a response to powerful market demands in Europe, transforming not only New Zealand's countryside, economy and production techniques, but also encouraging migration to supply the labour the industry needed. 20th century The Reserve Bank of New Zealand was established as New Zealand's central bank on 1 August 1934. Up until that time New Zealand's monetary policy had been set in the United Kingdom, and the New Zealand Pound was issued by private banks.
A separate central bank gave New Zealand's government control of monetary policy for the first time, although New Zealand remained part of the sterling area by pegging its pound to the British pound sterling until the introduction of the New Zealand dollar in 1967, after which the dollar was instead pegged to the United States dollar. By the mid-20th century, pastoral-farming products made up more than 90% of New Zealand's exports, 65% of which went to Britain in the 1950s. Having a secure market with guaranteed prices also enabled New Zealand to impose high tariffs on imported goods from other countries. Tough import controls gave local manufacturers the ability to produce similar products locally, broaden the base of jobs available in New Zealand and still compete against higher-priced imports. This prosperity continued up to 1955, at which point Britain stopped giving New Zealand guaranteed prices for its exports. From then on, what New Zealand received was dictated by the free market. As a result, during the 1950s and 1960s the country's standard of living began to slip as the export sector was no longer able to pay for the level of imported goods required to meet the country's growing consumerism. Britain applied to join the European Economic Community (EEC) in 1961, but was vetoed by the French. The government of Keith Holyoake reacted by attempting to diversify New Zealand's export markets, signing the first free trade agreement (Australia New Zealand Free Trade Agreement) in 1965, and opening new diplomatic posts in Hong Kong, Jakarta, Saigon, Los Angeles and San Francisco. Britain applied again to join the EEC in 1967, and entered into negotiations for membership in 1970. Holyoake's deputy and successor, Jack Marshall (briefly Prime Minister in 1972), negotiated continued access for New Zealand exports to the United Kingdom under the so-called "Luxembourg Agreement".
Britain gained full membership of the EEC on 1 January 1973, and all trade agreements with New Zealand came to an end, except the Luxembourg Agreement. By the end of that year, only 26.8% of New Zealand's exports were to Britain. This had a significant effect on the standard of living. In 1953, New Zealand had the third-highest standard of living in the world. By 1978, it had dropped to 22nd place. Having lost unrestricted access to its traditional market, New Zealand continued to search for alternative export markets and diversify its economy. The Government of Norman Kirk, who succeeded Marshall, put greater emphasis on expanding New Zealand's trade, especially with South East Asia. Following the Yom Kippur War in October 1973, an oil embargo was put in place by the Middle Eastern oil exporters, leading to the 1973 oil crisis. This compounded New Zealand's dire economic situation further. Inflation greatly increased as the cost of transport and imported goods soared, causing standards of living to decline. Think Big Following the 1979 energy crisis resulting from the Iranian Revolution of that year, Robert Muldoon, the prime minister between 1975 and 1984, instituted an economic strategy known as Think Big. Large-scale industrial plants were established based on New Zealand's abundant natural gas. A new range of export products such as ammonia, urea fertiliser, methanol and petrol was produced, and electricity was used more widely (including the electrification of the North Island Main Trunk railway), with the goal of reducing New Zealand's dependence on oil imports. Other projects included the Clyde Dam on the Clutha River, which was built to meet a growing demand for electricity, and the expansion of the New Zealand Steel plant at Glenbrook. The Tiwai Point Aluminium Smelter, which opened in 1971, was also upgraded as part of the Think Big strategy and now brings in approximately NZ$1 billion in exports every year.
Unfortunately for New Zealand, most of these projects only came online at the same time as oil prices dropped during the 1980s oil glut. The price of crude went from more than US$90 a barrel in 1980 to about US$30 a few years later. Because these Think Big projects required massive borrowing to get started, public debt soared from $4.2 billion in 1975 when Muldoon became prime minister to $21.9 billion when he left office nine years later. Inflation remained rampant, averaging 11% in the 1980s. Once Labour came to power in 1984, many of these projects were sold to private industry as part of a wider sale of state assets. The Muldoon Government did make some moves towards deregulation, however. For example, in 1982 it removed the transport licensing restrictions on road carriers carting goods more than 150 km, and turned the Railways Department into a statutory corporation. Rogernomics The Fourth Labour government, elected in July 1984, moved away from government intervention in the economy and allowed free market mechanisms to dominate. These reforms became known as "Rogernomics", named after the minister of finance from 1984 to 1988, Roger Douglas. The changes included making the Reserve Bank independent of political decisions; performance contracts for senior civil servants; public sector finance reform based on accrual accounting; tax neutrality; subsidy-free agriculture; and industry-neutral competition regulation. Government subsidies, including agricultural subsidies, were eliminated; import regulations were loosened; the exchange rate was floated; controls on interest rates, wages, and prices were removed; and personal rates of taxation were reduced. Tight monetary policy and major efforts to reduce the government budget deficit brought the inflation rate down from an annual rate of more than 18% in 1987.
The deregulation of government-owned enterprises in the 1980s and 1990s reduced government's role in the economy and permitted the retirement of some public debt. The new Government was faced with an exchange rate crisis the day after it was elected. Speculators expected the change of government to result in a 20% devaluation of the New Zealand dollar, which led to the | oil glut. The price of crude went from more than US$90 a barrel in 1980, to about US$30 a few years later. Because these Think Big projects required massive borrowing to get started, public debt soared from $4.2 billion in 1975 when Muldoon became prime minister to $21.9 billion when he left office nine years later. Inflation remained rampant, averaging 11% in the 1980s. Once Labour came to power in 1984, many of these projects were sold to private industry as part of a wider sale of state-assets. The Muldoon Government did make some moves towards deregulation however. For example, in 1982 it removed the transport licensing restrictions on road carriers carting goods more than 150 km, and turned the Railways Department into a statutory corporation. Rogernomics The Fourth Labour government, elected in July 1984, moved away from government intervention in the economy and allowed free market mechanisms to dominate. These reforms became known as "Rogernomics", named after minister of finance from 1984 to 1988, Roger Douglas. The changes included making the Reserve Bank independent of political decisions; performance contracts for senior civil servants; public sector finance reform based on accrual accounting; tax neutrality; subsidy-free agriculture; and industry-neutral competition regulation. Government subsidies including agricultural subsidies were eliminated; import regulations were loosened up; the exchange rate was floated; and controls on interest rates, wages, and prices were removed; and personal rates of taxation were reduced. 
The deregulation of government-owned enterprises in the 1980s and 1990s reduced government's role in the economy and permitted the retirement of some public debt. The new Government was faced with an exchange rate crisis the day after it was elected. Speculators expected the change of government to result in a 20% devaluation of the New Zealand dollar, and Muldoon's refusal to devalue led to the 1984 New Zealand constitutional crisis and worsened the currency crisis further. As a result, the dollar was floated on 4 March 1985, allowing its value to move with the market; prior to that, it had been pegged against a basket of currencies. Financial markets were deregulated, and tariffs on imported goods were lowered and phased out. At the same time, subsidies to many industries, notably agriculture, were removed or significantly reduced. Income and company taxes were reduced, and the top marginal tax rate was cut from 66% to 33%. These were replaced by a comprehensive tax on goods and services (GST), initially set at 10%, later increased to 12.5% and then to 15% in 2010. A surtax on universal superannuation was also introduced. Many government departments were corporatised, and from 1 April 1987 became state-owned enterprises, required to make a profit. The new corporations shed thousands of jobs, adding to unemployment: Electricity Corporation 3,000; Coal Corporation 4,000; Forestry Corporation 5,000; New Zealand Post 8,000. The wage and price freeze of the early eighties, coupled with the removal of financial restrictions and a lack of investment opportunities, led to a speculative bubble on New Zealand's sharemarket and the sharemarket crash of 1987, in which the market shed 60% of its value from its peak, taking several years to recover.
Inflation continued to be a major problem afflicting the New Zealand economy. Between 1985 and 1992, inflation averaged 9% per year and the economy was in recession. The unemployment rate rose from 3.6% to 11%, New Zealand's credit rating dropped twice, and foreign debt quadrupled. In 1989 the Reserve Bank Act 1989 was passed, creating strict monetary policy under the sole control of the Reserve Bank Governor. From then on the Reserve Bank focused on keeping inflation low and stable, using the Official Cash Rate (OCR) – the price of borrowing money in New Zealand – as its primary means to do so. As a result, inflation fell to an average of 2.5% in the 1990s, compared to 12% in the 1970s. However, the tightening of monetary policy contributed to rising unemployment in the early 1990s. The Labour Party was greatly divided over Rogernomics, especially following the 1987 sharemarket crash and its effect on the economy, which slumped along with the rest of the world into recession in the early 1990s. The National Party was returned to power at the 1990 general election and Ruth Richardson became minister of finance under Prime Minister Jim Bolger. The new Government immediately faced another major economic challenge, with the then state-owned Bank of New Zealand needing a bail-out to stay operational. Richardson's first budget in 1991, nicknamed the 'Mother of all Budgets', attempted to address persistent fiscal deficits and borrowing by cutting state spending. Unemployment and social welfare benefits were cut, and 'market rents' were introduced for state houses – in some cases tripling the rents of low-income people. Richardson also introduced user-pays requirements in hospitals and schools. These reforms became known derisively as 'Ruthanasia'.
By this time, New Zealand's economy faced serious social problems: the number of New Zealanders estimated to be living in poverty grew by at least 35% between 1989 and 1992, and many of the promised economic benefits of the experiment never materialised. Gross domestic product per capita stagnated between 1986–87 and 1993–94, and by March 1992 unemployment had risen to 11.1%. Between 1985 and 1992, New Zealand's economy grew by 4.7%, while the average OECD nation grew by 28.2% over the same period. Deregulation also created a business-friendly regulatory framework which has benefited those able to take advantage of it. A 2008 survey by The Heritage Foundation and The Wall Street Journal ranked New Zealand 99.9% for "business freedom" and 80% overall for "economic freedom", noting that it takes, on average, only 12 days to establish a business in New Zealand, compared with a worldwide average of 43 days. Deregulation has also been blamed for significant negative effects. One of these was the leaky homes crisis, whereby the loosening of building standards (in the expectation that market forces would assure quality) led to many thousands of severely deficient buildings, mostly residential homes and apartments, being constructed over a period of a decade. The cost of fixing the damage has been estimated at over NZ$11 billion.

21st century
Unemployment continued to fall from the 1993–94 fiscal year until the onset of the 1997 Asian financial crisis again pushed the rate higher. By 2016 the unemployment rate had decreased to 5.3 percent, its lowest level in seven years. Between 2000 and 2007, the New Zealand economy expanded by an average of 3.5% a year, driven primarily by private consumption and the buoyant housing market.
During this period, inflation averaged only 2.6% a year, within the Reserve Bank's target range of 1% to 3%. However, in early 2008 the economy entered recession, before the effects of the global financial crisis (GFC) set in later that year. A drought over the 2007/08 summer led to lower production of dairy products in the first half of 2008. Domestic activity slowed sharply over 2008 as high fuel and food prices dampened domestic consumption, while high interest rates and falling house prices drove a rapid decline in residential investment. Around the world, instability was developing in the finance sector. This reached a peak in September 2008 when Lehman Brothers, a major American bank, collapsed, propelling the world into the global financial crisis.

Finance company collapses
Uncertainty began to dominate the global financial and economic environment. Business and consumer confidence in New Zealand plummeted as dozens of finance companies collapsed. To try to stop a flight of funds from New Zealand institutions to those in Australia, the Government established the Crown Retail Deposit Guarantee Scheme to cover depositors' funds in the event that a bank or finance company went broke. The scheme protected some investors, but at least 67 finance companies nevertheless collapsed within a short period of time. The largest of these was South Canterbury Finance, which cost taxpayers NZ$1.58 billion when it collapsed in August 2010. The directors of many of these finance companies were subsequently investigated for fraud, and some high-profile directors went to prison. In an attempt to stimulate the economy, the Reserve Bank lowered the Official Cash Rate (OCR) from a high of 8.25% (July 2008) to an all-time low of 2.5% at the end of April 2009.
"Rock star" economy
Fortunately for New Zealand, the recession was relatively shallow compared to many other nations in the OECD: it was the sixth least affected of the 34 member nations, with negative real GDP growth totalling 3.5%. In 2009 the economy picked up, led by strong demand from major trading partners Australia and China, and historically high prices for New Zealand's dairy and log exports. In 2010 GDP grew by a modest 1.6%, but over the next couple of years economic activity continued to improve, driven by the rebuild in Canterbury after the Christchurch earthquakes and a recovery in domestic demand. Through 2011, global conditions deteriorated and the terms of trade eased off their 2011 peak, continuing to moderate until September 2012. Since then, commodity prices have rebounded strongly, with strong demand from China and an improving international situation. Commodity prices have been at record highs in recent quarters and remain elevated, and are expected to provide a considerable boost to nominal GDP growth in the near term. In 2013 the economy grew 3.3%. HSBC chief economist for Australia and New Zealand, Paul Bloxham, was so impressed that he predicted New Zealand's growth would outpace most of its peers, describing New Zealand as the "rock star economy of 2014". Another financial commentator said the New Zealand dollar was the "hottest" currency of 2014. Only three months later, the New Zealand Productivity Commission expressed concern about low living standards and problems affecting the long-term drivers of growth. Paul Conway, Director of Economics and Research at the Productivity Commission, wrote: "New Zealand's broad policy settings should generate GDP per capita 20 per cent above the OECD average, but the actual result is more than 20 per cent below average. We may be punching above our weight, but that's only because we are in the wrong weight division!"
In August, Bloxham admitted that "the sharp decline in dairy prices over the last six months has clouded the outlook somewhat". In December, however, Bloxham stated that he thought the New Zealand economy would continue to grow strongly. In 2014 increased attention was paid to the growing gap between rich and poor. In The Guardian, Max Rashbrooke said policies implemented by both Labour and National governments had increased inequality. He claims that for twenty years outrage "has been muted", but "Alarm bells are finally beginning to sound. Recent polling shows three-quarters of New Zealanders think theirs is no longer an egalitarian country".

COVID-19 pandemic
New Zealand recorded its first case of COVID-19 on 28 February 2020. In response to the pandemic, the country closed its borders to everyone except New Zealand citizens and residents on 19 March, and went into full (Level 4) lockdown from 26 March to 27 April, followed by a partial (Level 3) lockdown from 28 April to 13 May. The border closure combined with the lockdowns saw the retail, accommodation, hospitality, and transport sectors experience major declines. On 17 September 2020, New Zealand officially entered a recession, with the country's gross domestic product contracting by 12.2% in the June quarter. GDP rebounded 14% in the September quarter, leaving a 2.2% year-on-year contraction. After successfully containing the virus, the New Zealand economy saw sharp growth in what is known as a V-shaped recovery, ending the year with an overall economic expansion of 0.4%, better than the predicted 1.7% contraction. Unemployment also dropped to 4.9% in December 2020, down from a peak COVID-affected rate of 5.3% in September. By 23 September 2021, the Restaurant Association's chief executive Marisa Bidois estimated that about 1,000 hospitality businesses nationwide had been forced to close as a result of the COVID-19 pandemic, leading to the loss of 13,000 jobs.
In response, the Association lobbied the Government for continued wage subsidies and incentives to boost customer rates. On 13 November 2021, the Bay of Plenty Times reported that 26,774 companies had been liquidated during the first eight months of 2021. On 27 January 2022, it was reported that New Zealand's inflation rate had hit a 30-year high of 5.9% at the end of 2021. According to figures released by Statistics New Zealand, the rising cost of construction, petrol and rents pushed the consumer price index up 1.4 per cent between October and December 2021. Statistics NZ also recorded a two per cent increase in household utilities expenses, fuelled by the rising cost of new dwellings (which rose by 16% from 2020) and a 30 per cent hike in fuel prices (from NZ$1.87 per litre to $2.45 per litre). Prime Minister Jacinda Ardern attributed the sharp inflation rate to rising crude oil prices overseas. By contrast, the opposition National Party leader Christopher Luxon and finance spokesperson Simon Bridges attributed rising inflation to the Government's alleged "wasteful" spending. On 1 February 2022, an annual report released by the Organisation for Economic Co-operation and Development (OECD) identified the country's border restrictions and declining house prices as the main risks facing New Zealand's economy that year. While the OECD report credited New Zealand's elimination strategy and macroeconomic stimuli such as wage and socio-economic subsidies with helping the economy to bounce back to pre-COVID-19 levels, it also warned that excessive government spending was overheating the economy and driving substantial increases in household and government debt. The OECD welcomed the Reserve Bank's decision to raise interest rates but also urged the Government to raise the superannuation age, eliminate obstacles to building houses, and reduce government spending. The OECD also supported the introduction of a social insurance scheme for unemployed workers.
Trade
New Zealand's small size and long distances from major world markets create significant challenges for its ability to compete in global markets. In 2018, New Zealand's main trading partners were China, Australia, the European Union, the United States and Japan; together, these five partners accounted for 66% of New Zealand's two-way trade. In March 2014, the total value of goods exported from New Zealand topped $50 billion for the first time, up from $30 billion in 2001. New Zealand Trade and Enterprise (NZTE) offers strategic advice and support to New Zealand businesses wanting to export goods and services to other countries.

Trade agreements
Since the 1960s New Zealand has pursued free trade agreements with many countries to diversify its export markets and increase the competitiveness of New Zealand's exports to the world. As well as reducing barriers to trade, the trade agreements New Zealand has entered into are designed to ensure that existing access is maintained. Trade agreements establish rules by which trade can take place and ensure that regulators and officials in countries New Zealand is trading with work closely together.

Australia
Australia, New Zealand's closest neighbour, is its largest bilateral trading partner. In 2013, trade between New Zealand and Australia was worth NZ$25.6 billion. Economic and trading links between the two countries are underpinned by the "Closer Economic Relations" (CER) agreement, which allows for free trade in goods and most services. Since 1990, CER has created a single market of more than 25 million people. Australia is now the destination of 19% of New Zealand's exports, including light crude oil, gold, wine, cheese and timber, as well as a wide range of manufactured items.
The CER also creates a free labour market, which allows New Zealand and Australian citizens to live and work freely in each other's country, together with mutual recognition of professional qualifications. This means individuals registered to practise an occupation in one country can register to practise an equivalent occupation in the other. Banking regulation and supervision are co-ordinated through the Trans-Tasman Council on Banking Supervision, and there are also ongoing discussions about co-ordinating Australian and New Zealand business law.

China
China is New Zealand's largest trading partner, buying primarily meat, dairy products and pine logs. In 2013, trade between New Zealand and China was worth NZ$16.8 billion. This has occurred primarily because of soaring demand for imported dairy products following the Chinese milk scandal in 2008. Since then, demand for milk products has been so strong that in the 12 months to March 2014 there was a 51% increase in total exports to China. The increase was facilitated by the New Zealand–China Free Trade Agreement, which came into force on 1 October 2008; since that year exports to China have more than tripled.

United States
The United States is New Zealand's third largest trading partner. In 2013, bilateral trade between the two countries was valued at NZ$11.8 billion. New Zealand's main exports to the United States are beef, dairy products and lamb. Imports from the US include specialised machinery, pharmaceutical products, oil and fuel. In addition to trade, there is a high level of corporate and individual investment between the two countries, and the US is a major source of tourists coming to New Zealand. In March 2012, the United States had a total of $44 billion invested in New Zealand. A number of US companies have subsidiary branches in New Zealand; many operate through local agents, and some have joint venture associations.
The United States Chamber of Commerce is active in New Zealand, with a main office in Auckland and a branch committee in Wellington. According to the Ministry of Foreign Affairs, New Zealand and the United States "share a deep and longstanding friendship based on a common heritage, shared values and interests, and a commitment to promoting a free, democratic, secure and prosperous world". This common background has not, however, translated into a free trade agreement between the two countries.

Japan and other Asian economies
Japan is New Zealand's fourth largest trading partner. In the 21st century, Asian economies have been developing rapidly, providing significant demand for New Zealand's exports. New Zealand also trades with Taiwan, Hong Kong, Malaysia, Indonesia, Singapore, Thailand, India and the Philippines; this now accounts for around 16% of total exports. New Zealand initiated a free trade agreement with Singapore in September 2000, which was extended in 2005 to include Chile and Brunei and is now known as the P4 agreement.

European Union
A growing number of New Zealand companies use the United Kingdom as a base to supply their products to the European market. However, trade with the European Union is declining as demand from Asia continues to grow. The EU currently takes only 8% of New Zealand's exports but provides around 12% of its imports.
Chorus, which was split from Telecom (now Spark) in 2011, still owns the majority of the telecommunications infrastructure, but competition from other providers has increased. A large-scale rollout of gigabit-capable fibre to the premises, branded as Ultra-Fast Broadband, began in 2009 with a target of being available to 87% of the population by 2022. The United Nations International Telecommunication Union ranks New Zealand 13th in the development of information and communications infrastructure.

Telephones
Country calling code: 64. The same code is also used to reach Scott Base in Antarctica and the nearby United States base, McMurdo Station.

Mobile phone system
Number of mobile connections: 4.7 million (2010), with coverage available to approximately 97% of the population.
Operators:
- 2degrees (operating UMTS and LTE); virtual network operators: Warehouse Mobile (owned by The Warehouse Group)
- Spark New Zealand (operating UMTS, HSDPA and LTE); virtual network operators: Skinny (owned by Spark NZ), Digital Island, Vocus (previously CallPlus), Compass, Flexiroam
- Vodafone New Zealand (operating GSM, UMTS, HSDPA and LTE); virtual network operators: Flexiroam, Black+White, M2, Kogan Mobile NZ

Fixed-line telephone system
Number of fixed-line connections: 1.92 million (2000), with individual lines available to 99% of residences. VoIP and cloud-based voice services are now mainstream. The traditional copper-line operator is Chorus Limited; a large number of ISPs (referred to as "retail service providers") retail Chorus's connections to personal and business customers. As a wholesaler, Chorus does not retail internet connections to end users.

Cable and microwave links
Domestic: optical fibre and microwave links between cities, and submarine optical fibre cables between the North Island and the South Island.
International submarine cables: Hawaiki Cable (launched July 2018), Southern Cross Cable (to Australia and Hawaii), TASMAN 2 (to Australia), Tasman Global Access (to Australia, completed March 2017) and Moana Cable (proposed).
Satellite earth stations: 2 Intelsat (Pacific Ocean).

Radio
Radio broadcast stations: AM 124, FM 290, shortwave 4 (1998), plus 4 on Freeview digital satellite.
See also: List of radio stations in New Zealand. Radios: 3.75 million (1997).

Television
Television broadcast stations: 41, plus 52 medium-power repeaters and over 650 low-power repeaters (1997). These transmit 4 nationwide free-to-air networks and a few regional or local single-transmitter stations. Analogue was phased out between September 2012 and December 2013. Digital satellite pay TV is also available and carries most terrestrial networks. Freeview offers free-to-air digital terrestrial HD and SD content, as well as a free digital satellite service with a dozen SD channels, including SD feeds of the terrestrial HD Freeview channels. Cable TV is available in some urban areas through Vodafone's broadband services. See also: List of New Zealand television channels. Televisions: 1.926 million (1997).

Internet
Internet service providers (ISPs): 36 (2000). Internet users: 2.11 million (2002). Fixed internet connections: 1.24 million (2013). Country code (top-level domain): .nz

Telecommunications Development Levy
The government charges a $50 million Telecommunications Development Levy annually to fund improvements to communications infrastructure such as the Rural Broadband Initiative. It is payable by telecommunications firms with an operating revenue of over $10 million, in proportion to their qualified revenue.
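The levy's pro-rata rule described above can be sketched as follows. This is a minimal illustration, not the statutory formula: the firm names and revenue figures are hypothetical, and the real scheme's definition of "qualified revenue" is more detailed than a single number per firm.

```python
# Hedged sketch of the Telecommunications Development Levy allocation:
# firms over the revenue threshold share a fixed pool in proportion to
# their qualified revenue. All figures below are hypothetical.

LEVY_TOTAL = 50_000_000         # annual levy pool (NZ$)
REVENUE_THRESHOLD = 10_000_000  # firms at or below this pay nothing

def levy_shares(qualified_revenue: dict[str, float]) -> dict[str, float]:
    """Split the levy among liable firms in proportion to qualified revenue."""
    liable = {name: rev for name, rev in qualified_revenue.items()
              if rev > REVENUE_THRESHOLD}
    total = sum(liable.values())
    return {name: LEVY_TOTAL * rev / total for name, rev in liable.items()}

# Hypothetical firms: the two large firms are liable, the small ISP is exempt.
shares = levy_shares({
    "TelcoA": 3_000_000_000,
    "TelcoB": 2_000_000_000,
    "SmallISP": 5_000_000,
})
# TelcoA carries 3/5 of the pool and TelcoB 2/5; SmallISP pays nothing.
```

The key design point is that the threshold only determines *who* pays; once a firm is liable, its share is proportional to its whole qualified revenue, not just the portion above the threshold.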
The speed limit on the open road is 100 km/h for cars and motorcycles, with a default limit of 50 km/h in urban areas. Some stretches of motorway and expressway in Waikato and the Bay of Plenty have a higher posted speed limit of 110 km/h. Lower speed limits are also used, set in 10 km/h increments, and the posted speed limit may be higher than the speed allowed for a particular vehicle type. Speeds are often temporarily reduced beside roadworks. Private landowners may set their own speed limits, although these are not enforced by police or road authorities. The Land Transport Rule: Setting of Speed Limits (2017) allows road controlling authorities to set enforceable speed limits, including permanent speed limits, of less than 50 km/h on roads within their jurisdiction.

Road safety
Total road deaths in New Zealand are high by developed-country standards. 2010 figures from the International Transport Forum placed New Zealand 25th out of 33 surveyed countries in terms of road deaths per capita, a rank that has changed little in 30 years. The per-capita fatality rate is twice that of Germany, the United Kingdom, Sweden or the Netherlands (2010 comparison). This is variously blamed on aggressive driving, insufficient driver training, old and unsafe cars, inferior road design and construction, and a lack of appreciation of the skill and responsibility required to safely operate a motor vehicle. In 2010, 375 road users were killed in New Zealand and 14,031 were injured, with 15- to 24-year-olds the group at highest risk. The three most common vehicle movements resulting in death or injury were head-on collisions (while not overtaking), loss of control (on a straight) and loss of control (while cornering). In terms of deaths per 10,000 population, the most dangerous areas were the Waitomo District (121 deaths) and the Mackenzie District (110). Larger cities were comparatively safe – Auckland City (28), Wellington (22) and Christchurch (28) – while Dunedin had a higher rate of 43.
New Zealand has a large number of overseas drivers (tourists, business travellers, students and new immigrants), many of them renting campervans, motorhomes or RVs during the New Zealand summer. Overseas-licensed drivers are significantly more likely to be found at fault in a collision in which they are involved (66.9%) than fully licensed New Zealand drivers (51.9%), and only slightly less likely than restricted (novice) New Zealand drivers (68.9%). Drunk driving is a major issue in New Zealand, especially among young drivers, and New Zealand has relatively low penalties for drunk driving. In the late 2000s, reports indicated that the rate of drunk driving by under-20s in Auckland had risen 77% in three years, with similar increases in the rest of the country. Many drunk drivers already had convictions for previous drunk driving. The road toll decreased over five years from 421 in 2007 to 284 in 2011. In the 'Safer Journeys' strategy, intended to guide road safety developments between 2010 and 2020, the Ministry of Transport adopted a 'safe systems' approach and prioritised four areas: "Increasing the safety of young drivers", "Reducing alcohol/drug impaired driving", "Safe roads and roadsides" and "Increasing the safety of motorcycling".

Funding
Historically, most roads in New Zealand were funded by local road authorities (often road boards), which derived their income from local rates. As the need for new roads was often most urgent in those parts of the country where little rate income could yet be collected, the funding was at least partly dependent on national-level subsidies, for which much lobbying was undertaken. Many acts and ordinances were passed in the first decades of the colony, but lack of funds and parochialism (the desire to spend locally raised money locally, rather than use it to link different provinces) hindered the growth of the road network.
This lack of larger-scale planning eventually led to increased public works powers being given to the central government. Today, all funding for state highways and around 50% of funding for local roads comes directly from road users through the National Land Transport Fund. Road-user revenue directed to the fund includes all fuel excise duty on LPG and CNG, around 55% of revenue from fuel excise duty on petrol, all revenue from road user charges (a prepaid distance/weight licence that all vehicles over 3.5 tonnes, and all non-petrol/LPG/CNG vehicles, are liable to pay) and most non-ACC revenue from motor vehicle registration and licensing fees. In addition, in the last three years the government has increasingly allocated additional funds to land transport, to the extent that today the total expenditure by the NZ Transport Agency on land transport projects exceeds road tax revenue collected. The remainder of funding for local city and district roads comes primarily from local authority property rates. As of 2010, transport funding in New Zealand is still heavily biased towards road projects – the National government proposes to spend $21 billion on roading infrastructure after 2012, yet only $0.7 billion on other transport projects (public transport, walking and cycling). This has been criticised by opponents of the current government strategy as irresponsible in light of increasing fuel prices and congestion. The Government has claimed that its priority on roads is in line with New Zealanders' favoured travel modes and the most promising in terms of economic benefits.

Vehicle fleet
One of the earliest counts/estimates of motor vehicles in New Zealand put them at 82,000 in 1925. This soon increased to 170,000 on the eve of World War II in 1939, continuing to 425,000 in 1953 and 1,000,000 in 1971. In the first national vehicle registration of 1925, 99,233 plates were issued.
In 1931, 156,180 motor vehicles were registered; 298,586 were licensed in 1939 and 380,503 in 1950. Just over half of the light passenger vehicles first registered in New Zealand are used imports. In 2013, new car registrations were up 7% on 2012 to 82,235 sold, with used vehicle sales up to 98,971. At the 2013 New Zealand census, 92.1 percent of households reported owning at least one car: 37.6 percent reported owning one car, 38.4 percent two cars, and 16.1 percent three or more cars. Car ownership was highest in the Tasman Region (95.9 percent) and lowest in the Wellington Region (88.3 percent). In 2015, 3.018 million vehicles were light passenger vehicles, 507,000 were light commercial vehicles, 137,000 were heavy trucks, 10,000 were buses and 160,000 were motorcycles and mopeds. The mean age of a New Zealand car (as of the end of 2015) was 14.2 years, and of trucks 17.6 years. In 2017, 38% of light vehicles were at least 15 years old; 171,000 vehicles were deregistered, but 334,000 were added. By 2017 there were 792 light vehicles per 1,000 people – one of the highest rates of vehicle ownership in the world – and they covered 9,265 km per capita. Average engine capacity of light vehicles grew until 2010 and was about 2,290 cc in 2017, with average CO2 emissions of about 180 g/km.

Freight
Freight volumes in 2017 were up 7.3% to 25.3 billion tonne-kilometres, from 23.6 billion tonne-kilometres in 2016.

Passenger services
Without rapid transit, bus services form the main component of public transport in New Zealand cities, and the country also has a network of long-distance bus or coach services, augmented by door-to-door inter-city shuttle vans, a type of shared taxi. The first widespread motor vehicle services were shared-taxi services termed service cars; a significant early provider was Aard, operating elongated Hudson Super Sixes.
By 1920 Aard covered most of the North Island and even provided transport for the Prince of Wales, and by 1924 its services covered even more areas. Aard was taken over by New Zealand Railways Road Services in 1928. The road fleet of the New Zealand Railways Corporation was privatised in 1991, with the long-distance business still existing as InterCity, which has more recently incorporated Newmans Coachlines. Another former extensive coach business was Mount Cook Landlines, which closed in the 1990s. Internet-based nakedbus.com is building another nationwide network, partly as a reseller of several smaller bus operators' capacity. InterCity and Tourism Holdings Ltd are significant sightseeing/tourism coach operators.

Cycling
While relatively popular for sport and recreation, the bicycle is a very marginal commuting mode, with its share hovering around 1% in many major cities and around 2% nationwide (2000s figures). This is primarily due to safety fears: for instance, the Auckland Regional Transport Authority reports that "over half of Aucklanders believe it is usually unsafe, or always unsafe, to cycle". The high risk to bicycle users is due to a number of factors. Motorists tend to exhibit hostile attitudes towards bicycle riders. Bicycles are classed as 'vehicles', a transport class legally obliged to use the road, forcing bicycle users to mingle with heavy and fast-moving motor vehicles; only postal workers are legally permitted to ride on footpaths. Bicycle infrastructure, and the standards underpinning its planning, are poor, and bicycles receive very low levels of funding from both central and local government. It has also been argued that the introduction of New Zealand's compulsory bicycle helmet law contributed to the decline in the popularity of cycling.

Rail transport

Network
There is a total of 3,898 km of railway line in New Zealand, built to the narrow gauge of 1,067 mm. Of this, 506 km is electrified.
The national network's land is owned by New Zealand Railways Corporation, and the network owner and major rail transport operator is the state-owned enterprise KiwiRail. The national network consists of three main trunk lines, seven secondary main lines and, during its peak in the 1950s, around ninety branch lines. The majority of the branch lines are now closed. Most lines were constructed by government, but a few were of private origin and later nationalised. In 1931, the Transport Licensing Act was passed, protecting the railways from competition for fifty years. The Railways Corporation was created in 1982 from the New Zealand Railways Department, and the land transport industry became fully deregulated in 1983. Between 1982 and 1993 the rail industry underwent a major overhaul involving corporatisation, restructuring, downsizing, line and station closures and privatisation. In 1991 the Railways Corporation was split up, with New Zealand Rail Limited established to operate the rail and inter-island ferry services and own the rail network, and the parcels and bus services sold to private investors. The Railways Corporation continued to own the land underneath the rail network, as well as significant property holdings that were later disposed of. In 1993 New Zealand Rail was itself privatised; it was listed by its new owners in 1995 and renamed Tranz Rail. The Government took back control of the national rail network when Toll Holdings purchased Tranz Rail in 2003, placing it under ONTRACK, a division of the Railways Corporation. In May 2008 the Government agreed to buy Toll NZ's rail and ferry operations for $665 million, and renamed the operating company KiwiRail. New Zealand has no rapid transit metro lines. Operators and services Bulk freight dominates services, particularly coal, logs and wood products, milk and milk products, fertiliser, containers, steel and cars.
Long distance passenger services are limited to three routes – the TranzAlpine (Christchurch – Greymouth), the TranzCoastal (Christchurch – Picton) and the Northern Explorer (Wellington – Auckland). Urban rail services operate in Wellington and Auckland, and interurban services run between Palmerston North and Wellington (the Capital Connection) and Masterton and Wellington (the Wairarapa Connection). For most of its history, New Zealand's rail services were operated by the Railways Department. In 1982, the Department was corporatised as the New Zealand Railways Corporation. The Corporation was split in 1990 between a limited liability operating company, New Zealand Rail Limited, and the Corporation, which retained a number of assets to be disposed of. New Zealand Rail was privatised in 1993 and renamed Tranz Rail in 1995. In 2001, Tranz Rail's long-distance passenger operations, under the Tranz Scenic brand, became a separate company; Tranz Rail chose not to bid for the contract to run Auckland's rail services, and the contract was won by Connex (now Transdev Auckland). Proposals to sell Tranz Rail's Wellington passenger rail services, Tranz Metro, did not come to fruition, although the division became a separate company in July 2003. In 2003 Tranz Rail was purchased by Australian freight firm Toll Holdings, which renamed the company Toll NZ. The only other significant non-heritage operator is the tourist-oriented Dunedin Railways in Otago, which runs regular passenger trains on part of the former Otago Central Railway and some on the Main South Line. On 20 April 2020 the company announced that, due to the COVID-19 pandemic, it was mothballing its track and equipment. Heritage The | the environs of Wellington and opened in 1950, between Takapu Road and Johnsonville. Following heavy investment in road construction from the 1950s onwards, public transport patronage fell nationwide.
This has been described, in Auckland's case, as "one of the most spectacular declines in public transport patronage of any developed city in the world". Network New Zealand has a state highway network spanning the North and South Islands (as of August 2006), a small proportion of which is motorway. These link to local authority roads, both paved and unpaved. The state highways carry 50% of all New Zealand road traffic, with the motorways alone carrying 9% of all traffic (even though they represent only 3% of the whole state highway network, and even less of the whole road network). Speed limits The default maximum speed limit on the open road is 100 km/h for cars and motorcycles, with a default limit of 50 km/h in urban areas. Some stretches of motorway and expressway in Waikato and the Bay of Plenty have a higher posted speed limit of 110 km/h. Other speed limits are also used, and the posted speed limit may be higher than the speed allowed for a particular vehicle type. Speeds are often reduced beside roadworks. Private landowners may set their own speed limits, although these are not enforced by police or road authorities. The Land Transport Rule: Setting of Speed Limits (2017) allows road controlling authorities to set enforceable speed limits, including permanent speed limits, of less than 50 km/h on roads within their jurisdiction. Road safety Total road deaths in New Zealand are high by developed country standards. 2010 figures from the International Transport Forum placed New Zealand 25th out of 33 surveyed countries in terms of road deaths per capita, a rank that has changed little in 30 years. The fatality rate per capita is twice that of Germany, the United Kingdom, Sweden or the Netherlands (2010 comparison).
This is variously blamed on aggressive driving, insufficient driver training, old and unsafe cars, inferior road design and construction, and a lack of appreciation of the skill and responsibility required to safely operate a motor vehicle. In 2010, 375 road users were killed in New Zealand, while 14,031 were injured, with 15- to 24-year-olds the group at highest risk. The three most common vehicle movements resulting in death or injury were "head-on collisions (while not overtaking)", "loss of control (on straight)" and "loss of control (while cornering)". In terms of deaths per 10,000 population, the most dangerous areas were the Waitomo District (121) and the Mackenzie District (110). Larger cities were comparatively safe, with Auckland City (28), Wellington (22) and Christchurch (28), while Dunedin had a higher rate of 43. New Zealand has a large number of overseas drivers (tourists, business travellers, students and new immigrants), many of whom rent campervans, motorhomes or RVs during the New Zealand summer. Overseas licensed drivers are significantly more likely to be found at fault in a collision in which they are involved (66.9%), compared to fully licensed New Zealand drivers (51.9%), and only slightly less likely to be found at fault than restricted (novice) New Zealand drivers (68.9%). Drunk driving is a major issue in New Zealand, especially among young drivers, and New Zealand has relatively low penalties for it. In the late 2000s, reports indicated that the rate of drunk driving by under-20s in Auckland had risen 77% in three years, with similar increases in the rest of the country. Many drunk drivers already had convictions for previous drunk driving.
The road toll decreased over the five years from 421 in 2007 to 284 in 2011. In the 'Safer Journeys' strategy, intended to guide road safety developments between 2010 and 2020, the Ministry of Transport takes a 'safe systems' approach and prioritises four areas: "Increasing the safety of young drivers", "Reducing alcohol/drug impaired driving", "Safe roads and roadsides" and "Increasing the safety of motorcycling". Funding Historically, most roads in New Zealand were funded by local road authorities (often road boards), who derived their income from local rates. As the need for new roads was often most urgent in those parts of the country where little rate income could yet be collected, the funding was at least partly dependent on national-level subsidies, for which much lobbying was undertaken. Many acts and ordinances were passed in the first decades of the colony, but lack of funds and parochialism (the desire to spend locally raised money locally, rather than use it to link different provinces) hindered the growth of the road network. This lack of larger-scale planning eventually led to increased public works powers being given to the central government. Today, all funding for state highways and around 50% of funding for local roads comes directly from road users through the National Land Transport Fund. Road user revenue directed to the fund includes all fuel excise duty on LPG and CNG, around 55% of revenue from fuel excise duty on petrol, all revenue from road user charges (a prepaid distance/weight licence that all vehicles over 3.5 tonnes, and all non-petrol/LPG/CNG vehicles, are liable to pay) and most non-ACC revenue from motor vehicle registration and licensing fees. In addition, in the last three years the government has increasingly allocated additional funds to land transport, to the extent that today the total expenditure by the NZ Transport Agency on land transport projects exceeds road tax revenue collected.
The remainder of funding for local city and district roads primarily comes from local authority property rates. As of 2010, transport funding in New Zealand is still heavily biased towards road projects – the National government proposes to spend $21 billion on roading infrastructure after 2012, yet only $0.7 billion on other transport projects (public transport, walking and cycling). This has been criticised by opponents of the current government strategy as irresponsible, in light of increasing fuel prices and congestion. The government has claimed that its priority on roads is in line with New Zealanders' favoured travel modes, and the most promising in terms of economic benefits. Vehicle fleet One of the earliest counts/estimates of motor vehicles in New Zealand had them at 82,000 in 1925. This soon increased to 170,000 on the eve of World War II in 1939, continuing to 425,000 in 1953 and increasing to 1,000,000 in 1971. In the first national vehicle registration of 1925, 99,233 plates were issued.
This was a prelude to what was to become the First Taranaki War, and to a period of conflict in the North Island that lasted until 1872. The newly formed New Zealand Parliament revised and expanded the Militia Ordinance, replacing it with the Militia Act 1858. Among the main changes were clauses enabling volunteers to be included under such terms and conditions as the Governor might specify. The act also outlined the purposes for which the Militia could be called upon, including invasion. Debates in Parliament included expressions of concern about Russian naval expansion in the northern Pacific, the fact that the colony's sole naval defence consisted of one 24-gun frigate, and the time it would take for Britain to come to the colony's aid. British Imperial troops remained in New Zealand until February 1870, during the later stage of the New Zealand Wars, by which time settler units had replaced them. The Defence Act 1886 reclassified the militia as volunteers. These were the forerunners of the Territorials. Volunteers (1858–1909) Although there were informal volunteer units as early as 1845, formal approval and regulation of the units did not occur until the Militia Act 1858. Those who signed up for these units were exempt from militia duty, but had to be prepared to serve anywhere in New Zealand. One of the earliest gazetted units (13 January 1859) was the Taranaki Volunteer Rifle Company. To the Volunteer Rifle Corps were added Volunteer Artillery Corps in mid-1859. The first of these Volunteer Artillery Corps were based in Auckland. By late 1859 the number of volunteer units was so great that Captain H C Balneavis was appointed Deputy Adjutant-General, based at Auckland. Colonial Defence Force (1862–1867) In 1863 the government passed the Colonial Defence Force Act 1862, creating the first Regular Force. This was to be a mounted body of not more than 500 troops, comprising both Maori and settlers, and costing no more than 30,000 pounds per annum.
All were volunteers and were expected to serve for three years. Formation of the first unit did not begin until early April 1863, with 100 men being sought at New Plymouth under Captain Atkinson. Hawke's Bay was to have the next unit. By late April, papers were reporting that few had enlisted in New Plymouth. Formation of an Auckland unit under Colonel Nixon commenced in July, and by the 14th it had 30 men. Authorised units by July 1863 (Commander: Major-General Galloway) By October 1863 there was no Wairarapa-based defence force, and 50 men were based in Wanganui. The Otago force had earlier been moved to Wellington, with further Otago volunteers heading for the Auckland and Hawke's Bay units. The total Defence Force numbered 375 by 3 November 1863. In October 1864 the Government decided to reduce the Colonial Defence Force to 75 men, with three units of 25 members each in Wellington, Hawke's Bay and Taranaki. By this time there were about 10,000 British Imperial troops in New Zealand, supplemented by about as many New Zealand volunteer and militia forces. There were calls, particularly from South Island papers, for the British Imperial troops to be replaced by local forces. Parliamentary debates in late 1864 also supported this view, especially as the cost of maintaining the Imperial troops was becoming a greater financial burden on the colony. Defence review, March 1865 At the request of the governor in January 1865, a formal statement on the defence of the colony was presented on 20 March 1865. This proposed that an armed constabulary force, supported by friendly natives, volunteer units, and militia as the case might require, be established to take the place of the Imperial troops. The proposed force was to consist of 1,350 Europeans and 150 Maori – 1,500 in total. They were to be divided into 30 companies of 50 men each. The total Defence budget, which included purchasing a steamer for use on the Waikato, Patea, and Wanganui rivers, was 187,000 pounds per annum.
The budget's focus was solely on internal conflict. The issue of external conflict did not begin to resurface until the following year, with thought again being given to coastal defences. The Colonial Defence Force was disbanded in October 1867 by the Armed Constabulary Act 1867. Its members transferred to the Armed Constabulary. Evolution of volunteers and militia From 1863 to 1867 Forest Ranger volunteer units were formed, tasked with searching out Maori war parties, acting as scouts, and protecting lines of communication. They arose out of the need to prevent ambushes and random attacks on civilians near forest areas. The Rangers were well armed and more highly paid. These units used guerrilla-style tactics, moving through areas under cover of darkness and ambushing war parties. The Forest Rangers were disbanded on 1 October 1867. See New Zealand Police Alongside the militia and the British Imperial forces were the Armed Constabulary, formed in 1846 with the passage of the Armed Constabulary Ordinance. The Constabulary's role encompassed both regular law enforcement and, during the New Zealand Wars, militia support. From 1867 to 1886 the Armed Constabulary were the only permanent force in New Zealand. In 1886 the militia functions of the Armed Constabulary were transferred to the New Zealand Permanent Militia by the Defence Act 1886. Lieutenant Colonel John Roberts was the Permanent Militia's first commander, from January 1887 until his retirement in 1888. Defence Act 1909 The Defence Act 1909 replaced the Volunteer forces with a Territorial force and compulsory military training, a regime that remained until the late 1960s, with breaks from 1918 to 1921, 1930 to 194?, and 194? to 1948.
Separate services (from 1909) See Royal New Zealand Navy, New Zealand Army, Royal New Zealand Air Force Independent New Zealand armed forces developed in the early twentieth century; the Royal New Zealand Navy was the last to emerge as an independent service, in 1941. Prior to that time it had been the New Zealand Division of the Royal Navy. New Zealand forces served alongside the British and other Empire and Commonwealth nations in World War I and World War II. The fall of Singapore in 1942 showed that Britain could no longer protect its far-flung Dominions, so closer military ties were necessary for New Zealand's defence; with the United States entering the war, it was the obvious partner. Links with Australia had also been developed earlier: both nations sent troops to the Anglo-Boer War, and New Zealand officer candidates have trained at Australia's Royal Military College, Duntroon, since 1911, a practice that continues to this day. A combined Australian and New Zealand Army Corps (ANZAC) was formed for the Gallipoli campaign during World War I, and its exploits are key events in the military history of both countries. The NZDF came into existence under the Defence Act 1990. Under previous legislation, the three services were part of the Ministry of Defence. Post-1990, the Ministry of Defence is a separate, policy-making body under a Secretary of Defence, equal in status to the Chief of Defence Force. For the first time, two of the service Deputy Chiefs (of Navy and of Air Force, holding the one-star ranks of Commodore and Air Commodore respectively) are women. Captain Melissa Ross was promoted to Commodore and appointed Deputy Chief of Navy in December 2019, while in August 2019 Group Captain Carol Abraham was promoted to Air Commodore and appointed Deputy Chief of Air Force. Another female officer, Colonel Helen Cooper, had previously held the post of Deputy Chief of Army, though in an acting capacity, without being promoted to the customary rank of Brigadier.
In an unusual move, as of February 2020 the Deputy Chief of Army, Evan Williams, holds not the customary rank of Brigadier but that of Major General, usually held only by the Chief of Army. Higher direction of the armed services A new HQNZDF facility was opened by Prime Minister Helen Clark in March 2007. The new facility on Aitken St in the Wellington CBD replaced the premises on Stout St that had been the headquarters of the NZDF for nearly 75 years. The Aitken St facility was initially home to around 900 employees of the NZDF, the New Zealand Security Intelligence Service (NZSIS) and the New Zealand Ministry of Defence; the NZSIS moved across to Pipitea House in early 2013, and the NZDF was forced to vacate the Aitken St building after the 2016 Kaikoura earthquake, which seriously damaged it. As of October 2017 the building is undergoing demolition, scheduled to be completed in early 2018, with HQNZDF functions moved into other buildings and facilities across the region. HQNZDF operates as the administrative and support headquarters for the New Zealand Defence Force, with operational forces under the separate administrative command and control of HQJFNZ. Joint Forces Headquarters The operational forces of the three services are directed from Headquarters Joint Forces New Zealand, opposite Trentham Military Camp in Upper Hutt. HQ JFNZ was established at Trentham on 1 July 2001. From this building, a former NZ government computer centre that once housed the Army's Land Command, the Air Component Commander, Maritime Component Commander and Land Component Commander exercise command over their forces. Commander Joint Forces New Zealand (COMJFNZ) controls all overseas operational deployments and most overseas exercises. Senior officers: The Defence Force created a joint-service corporate services organisation known as the Joint Logistics and Support Organisation (JLSO) in the 2000s, which later became Defence Shared Services.
Following the establishment of Special Operations Command on 1 July 2015, the new position of Special Operations Component Commander was created. This officer reports to the Commander Joint Forces New Zealand and is of equivalent status to the Maritime, Land and Air Component Commanders. Support for servicemen and women In recent years, the New Zealand Defence Force has implemented a policy of honouring veterans and has increased its support to serving servicemen and women in a number of ways. This includes starting the Defence Force KiwiSaver Scheme and appointing financial advisers to support the welfare of members. Branches Navy The Royal New Zealand Navy (RNZN) has 2,268 full-time and 543 part-time sailors. The RNZN possesses two Anzac class frigates, developed in conjunction with Australia and based on the German MEKO 200 design. Nine other vessels are in use, consisting of patrol vessels and logistics vessels. In 2010, the RNZN completed the acquisition of seven new vessels under Project Protector: one large Multi-Role Vessel, HMNZS Canterbury, two Offshore Patrol Vessels, and four Inshore Patrol Vessels. These vessels were built to commercial, not naval, standards. Army The New Zealand Army has 4,637 full-time and 1,778 part-time troops. They are organised as light infantry and motorised infantry equipped with 102 Canadian-manufactured LAV III Light Armoured Vehicles (NZLAV). There are also armoured reconnaissance, artillery, logistic, communications, medical and intelligence elements. The New Zealand Special Air Service is the NZDF's special forces capability, which operates in both conventional warfare and counter-terrorist roles.
The Corps and Regiments of the New Zealand Army include the Royal New Zealand Infantry Regiment, Royal New Zealand Armoured Corps, Royal Regiment of New Zealand Artillery, Corps of Royal New Zealand Engineers, Royal New Zealand Corps of Signals, Royal New Zealand Army Logistic Regiment, Royal New Zealand Army Medical Corps, Royal New Zealand Army Nursing Corps, Royal New Zealand Army Dental Corps, Corps of Royal New Zealand Military Police, New Zealand Intelligence Corps and New Zealand Army Legal Services. Air Force The Royal New Zealand Air Force (RNZAF) has 2,542 full-time and 285 part-time airmen and airwomen. The RNZAF operates 51 aircraft, including P-3 Orion maritime patrol aircraft, Lockheed C-130 Hercules and other transport aircraft. The NHIndustries NH90 operates in a medium-utility role, and the AgustaWestland A109 fills the light utility helicopter role in addition to serving as a training platform. RNZAF primary flight training occurs on the Beechcraft T-6 Texan II before pilots move on to the Beechcraft King Air. The RNZAF has had no air combat capability since the retirement without replacement of its Air Combat Force of A-4 Skyhawks in December 2001. Overseas Deployments Foreign defence relations New Zealand states it maintains a "credible minimum force", although critics (including the New Zealand National Party while in opposition) maintain that the country's defence forces have fallen below this standard. With a claimed area of direct strategic concern that extends from Australia to Southeast Asia to the South Pacific, and with defence expenditure totalling around 1% of GDP, New Zealand necessarily places substantial reliance on co-operation with other countries, particularly Australia.
Acknowledging the need to improve its defence capabilities, the government in 2005 announced the Defence Sustainability Initiative, allocating an additional NZ$4.6 billion over 10 years to modernise the country's defence equipment and infrastructure and increase its military personnel. The funding represented a 51% increase in defence spending since the Labour government took office in 1999. New Zealand is an active participant in multilateral peacekeeping. It has taken a leading role in peacekeeping in the Solomon Islands and on the neighbouring island of Bougainville. New Zealand has contributed to United Nations and other peacekeeping operations in Angola, Cambodia, Somalia, Lebanon and the former Yugoslavia. It also participated in the Multilateral Interception Force in the Persian Gulf. New Zealand has an ongoing peacekeeping commitment to East Timor, where it participated in the INTERFET, UNTAET and UNAMET missions from 1999 to 2002. At one point over 1,000 NZDF personnel were in East Timor. The deployment included the vessels HMNZS Canterbury, Te Kaha and Endeavour, six Iroquois helicopters, two C-130 Hercules and an infantry battalion. In response to renewed conflict in 2006, more troops were deployed as part of an international force. New Zealand has participated in two NATO-led coalitions: SFOR in the former Yugoslavia (until December 2004) and an ongoing one in Afghanistan (which took over from a US-led coalition in 2006). New Zealand also participated in the European Union EUFOR operation in the former Yugoslavia from December 2004 |
signed the Treaty of Versailles (1919) and joined the League of Nations. Wellington trusted Conservative Party governments in London, but not Labour. When the British Labour Party took power in 1924 and 1929, the New Zealand government felt threatened by Labour's foreign policy because of its reliance upon the League of Nations. Wellington distrusted the League and did not expect to see the coming of a peaceful world order under League auspices. What had been the Empire's most loyal Dominion became a dissenter as it opposed the efforts of the first and second British Labour governments to trust the League's framework of arbitration and collective security agreements. The governments of the Reform and United Parties between 1912 and 1935 followed a "realistic" foreign policy. They made national security a high priority, were skeptical of international institutions such as the League, and showed no interest in the questions of self-determination, democracy, and human rights. The opposition Labour Party, however, was more idealistic and advocated a liberal internationalist outlook on international affairs. From 1935 the First Labour Government showed a limited degree of idealism in foreign policy, for example opposing the appeasement of Nazi Germany and Japan. Second World War When World War II broke out in 1939, New Zealand whole-heartedly joined in the defence of Britain, with Prime Minister Michael Joseph Savage declaring that "where Britain goes, we go; where Britain stands, we stand". New Zealand soldiers served in North Africa, Italy and the Pacific, and airmen in England and the Pacific, throughout the war, even when New Zealand had concerns about invasion by the Japanese. Since 1945 During World War II the New Zealand government created a Department of External Affairs (now the Ministry of Foreign Affairs and Trade) in 1943, taking control of foreign policy that had previously been run by the Dominions Office in London.
In 1947 New Zealand ratified the 1931 Statute of Westminster with the Statute of Westminster Adoption Act 1947, which made New Zealand fully independent of Britain. The Fall of Singapore during World War II had made New Zealand realise that it could no longer rely on Britain to defend the Empire. New Zealand troops supported the British in the successful battle against Communist insurrection in Malaya and maintained an air-force fighter squadron in Singapore, and later on Cyprus, again supporting British forces. New Zealand diplomats sought an alliance with the United States of America, and in 1951 the country adhered to the ANZUS Treaty between New Zealand, Australia and the US. In return for America's guarantee of protection, New Zealand felt obliged to support America in its wars, and committed forces to the Korean War (1950–1953) under United Nations Command auspices and to the Vietnam War. By the 1970s, many New Zealanders had begun to feel uncomfortable with their country's support for the US, particularly in Vietnam and regarding visits by nuclear-powered and nuclear-armed United States warships. The Third Labour government (1972–1975) pulled New Zealand troops out of the Vietnam War and protested against French nuclear testing in the Pacific, at one stage sending a warship to act as disapproving witness to the tests. Britain's entry into the European Economic Community in 1973 forced New Zealand into a more independent role. The British move restricted New Zealand's trade access to its biggest market, and the country sought new trading partners in Asia, America and the Middle East. Australia and New Zealand signed the free-trade Closer Economic Relations agreement in 1983. The election of the Fourth Labour Government in 1984 marked a new period of independent foreign policy. Nuclear-powered and nuclear-armed ships were banned from New Zealand waters, effectively removing New Zealand from the ANZUS pact.
Immigration laws were liberalised, leading to a massive increase in immigration from Asia. The Fourth National Government (1990–1999) liberalised trade by removing most tariffs and import restrictions. In 2008, Minister of Foreign Affairs Winston Peters announced what he called "a seismic change for New Zealand’s foreign service", designed to remedy a country "struggling to maintain an adequate presence on the international stage". Peters said that the Ministry would receive additional funding and increase the number of New Zealand diplomats serving abroad by 50%. However, this policy was reversed following the 2008 general election, which brought the John Key-led Fifth National Government of New Zealand to power. Commonwealth of Nations New Zealand is a member state of the Commonwealth of Nations and one of its original members; the Dominion of New Zealand was declared on 26 September 1907. The reigning monarch and head of state, currently Queen Elizabeth II, Queen of New Zealand, is represented by the Governor-General of New Zealand. New Zealand has strong relations with most other Commonwealth countries and has High Commissioners and High Commissions in most of them. United Nations New Zealand was a founding member of the United Nations in 1945, and was in the first set of rotating non-permanent members of the United Nations Security Council. New Zealand Prime Minister Peter Fraser felt that for New Zealand to be secure in the South Pacific, it needed to align itself with major world powers like the United States through some kind of organisation that could guarantee small powers a say in world affairs. After the Fall of Singapore during World War II it became clear that Britain was no longer able to protect New Zealand, so the government decided that a policy of independent relations with a group of strong powers was the best way to defend the country.
Participation in international organisations New Zealand participates in the United Nations (UN); the World Trade Organization (WTO); World Bank; the International Monetary Fund (IMF); the Organisation for Economic Co-operation and Development (OECD); the International Energy Agency; the Asian Development Bank; the Pacific Islands Forum; the Secretariat of the Pacific Community; the Colombo Plan; Asia-Pacific Economic Cooperation (APEC); and the International Whaling Commission. New Zealand also actively participates as a member of the Commonwealth. Despite the 1985 rupture in the ANZUS military alliance, New Zealand has maintained good working relations with the United States and Australia on a broad array of international issues. In the past, New Zealand's geographic isolation and its agricultural economy's general prosperity minimised public interest in international affairs. However, growing global trade and other international economic events have made New Zealanders increasingly aware of their country’s dependence on unstable overseas markets. New Zealand governments strongly advocate free trade, especially in agricultural products, and the country belongs to the Cairns group of nations in the WTO. New Zealand's economic involvement with Asia has become increasingly important. New Zealand is a "dialogue partner" with the Association of Southeast Asian Nations (ASEAN), a member of the East Asia Summit and an active participant in APEC. As a charter member of the Colombo Plan, New Zealand has provided Asian countries with technical assistance and capital. It also contributes through the Asian Development Bank and through UN programs and is a member of the UN Economic and Social Council for Asia and the Pacific. Summary of international organisation participation ABEDA, ANZUS (U.S. 
suspended security obligations to NZ on 11 August 1986), APEC, ARF (dialogue partner), AsDB, ASEAN (dialogue partner), Australia Group, Commonwealth, CP, EBRD, ESCAP, FAO, IAEA, IBRD, ICAO, ICC, ICCt, ICFTU, ICRM, IDA, IEA, IFAD, IFC, IFRCS, IHO, ILO, IMF, IMO, Interpol, IOC, IOM, ISO, ITU, NAM (guest), NSG, OECD, OPCW, PCA, PIF, Sparteca, SPC, UN, UNAMSIL, UNCTAD, UNESCO, UNHCR, UNIDO, UNMIK, UNMISET, UNMOP, UNTSO, UPU, WCO, WFTU, WHO, WIPO, WMO, WTO Overseas territories New Zealand administers Tokelau (formerly known as the Tokelau Islands) as a non-self-governing territory. In February 2006 a UN-sponsored referendum was held in Tokelau on whether to become a self-governing state, but it failed to achieve the two-thirds majority required to pass. Samoa was a New Zealand protectorate from 1918 until full independence in 1962. New Zealand also retains some responsibilities for the former colonies Niue and the Cook Islands, which are in free association with New Zealand. Citizens of all three countries hold New Zealand citizenship and the associated rights to healthcare and education in New Zealand. New Zealand has also claimed part of Antarctica, known as the Ross Dependency, since 1923. Trade McGraw argues that, "Probably the greatest foreign policy achievement of [Helen] Clark's [1999–2008] term was the conclusion of a free trade agreement with China." Clark's government also set up a free-trade deal with Australia and the ten nations of ASEAN (the Association of South East Asian Nations). New Zealand has existing free trade agreements with Australia, Brunei, Chile, the People's Republic of China, Hong Kong, Singapore, and Thailand; new free trade agreements are under negotiation with ASEAN and Malaysia. New Zealand is involved in the WTO's Doha Development Agenda and was disappointed by the failure of the most recent talks in July 2006. New Zealand is a signatory of the Trans-Pacific Partnership.
The Labour-NZ First coalition government has committed to initiate a Closer Commonwealth Economic Relations (CCER) agreement with the UK, Australia, Canada and other countries and to work towards a Free Trade Agreement with the Russia-Belarus-Kazakhstan Customs Union. New Zealand's main export is food, primarily dairy products, meat, fruit and fish; about 95% of the country's agricultural produce is exported. Other major exports are wood, and mechanical and electrical equipment. About 46% of exports are non-agricultural, but the largest industry is still the food industry. Tourism is also an extremely important component of international trade: transport and travel form around 20% of the country's export trade. New Zealand does not have large quantities of mineral resources, though it does produce some coal, oil, aluminium and natural gas. New Zealand's largest source of imports is China, followed by (in order) Australia, the United States, Japan, and Singapore. The largest destinations for exports are, in order, Australia, China, the U.S., Japan, and South Korea. Trade figures for 2011 with New Zealand's biggest trade partners are as follows: Military Given its geography, New Zealand faces no immediate threat to its territorial integrity, and its defence posture and limited financial capability reflect this. The New Zealand Defence Force is small compared to those of many other countries and it lacks air combat capability, although its army is generally regarded as very professional. Its overseas duties consist mostly of peacekeeping, especially in the Pacific, with wider regional security falling to Australia. In the 21st century, peacekeeping detachments have been deployed to East Timor, the Solomon Islands, and Tonga. Engineering and support forces have also been involved in the Iraq War, although New Zealand is not a member of the 'coalition of the willing'.
New Zealand's heaviest military involvement in recent decades has been in Afghanistan, following the United States-led invasion of that country after the 9/11 attacks. The deployment has included SAS troops. In February 2021 MFAT confirmed that it had granted export permits for military equipment sold to the Armed Forces of Saudi Arabia in 2016 and 2018. Documents obtained under the Official Information Act detailed the transactions. This followed an earlier revelation that a business unit of Air New Zealand had been aiding the Royal Saudi Navy on a contractual basis, in breach of the airline's human rights obligations. In early April 2021 the Ministry of Foreign Affairs and Trade commissioned a former executive of the Ministry of Business, Innovation and Employment to review the case of Air New Zealand's Gas Turbines business unit aiding the Royal Saudi Navy. The contractual arrangement was criticised in light of Saudi Arabia's role in the Yemen war: the UN had reportedly expressed concern that any military exports to Saudi Arabia could be used in the Yemeni conflict, yet MFAT sanctioned the exports, inviting scrutiny of New Zealand's relations with Saudi Arabia.
Foreign aid New Zealand's official aid programme is managed by the New Zealand Agency for International Development (NZAID), a semi-autonomous body within the Ministry of Foreign Affairs and Trade. In 2007, New Zealand was the sixth lowest foreign aid donor in the Organisation for Economic Co-operation and Development (OECD), based on the proportion of gross national income (GNI) spent on overseas development assistance. New Zealand's contribution was 0.27% of GNI, much of which went to the Pacific region. However, the country is occasionally more generous in responding to major crises: it donated around $100 million to the 2004 Indian Ocean tsunami relief effort, committed $1 million to the 2010 Haiti earthquake relief effort, and later donated $2 million to the 2011 Japan earthquake and tsunami relief effort. Following the April and May 2015 Nepal earthquakes, the New Zealand Government sent an initial $1 million in humanitarian aid and mobilised 45 urban search and rescue technicians. New Zealand troops and aircraft are also often sent to disaster areas in the Asia-Pacific region. Nuclear free policy In the 1970s and 1980s, anti-nuclear sentiment increased across New Zealand, fuelled by concerns about French nuclear testing in the Pacific at Moruroa Atoll. The third Labour Government under Norman Kirk, with Australia as co-applicant, took France to the International Court of Justice in 1972, requesting that the French cease atmospheric nuclear testing at Moruroa Atoll in French Polynesia in the southern Pacific Ocean. That year, as an act of defiance and protest, the Kirk government sent two of its navy frigates into the Moruroa test zone. Peace yachts attempting to disrupt the French tests sailed in coordinated protests into the Moruroa exclusion zone between 1972 and 1991.
Concerns about nuclear proliferation and the presence of nuclear warheads or reactors on United States Navy ships visiting New Zealand ports continued to escalate. After it was elected in 1984, the Labour Party government of David Lange indicated its opposition to visits by such ships. In February 1985, New Zealand turned away a visiting US warship, and in response the United States announced that it was suspending its treaty obligations to New Zealand unless port access was restored. In 1987 the Labour government strengthened its stance by passing the New Zealand Nuclear Free Zone, Disarmament, and Arms Control Act 1987, declaring New Zealand a nuclear-free zone, effectively removing the country from the nuclear deterrent scenario and banning the entry of nuclear-powered warships into its ports. Warships that did not fall into this category were not blocked, but the US took the view that any subsequent warship visit to New Zealand could not be carried out without violating its security policy of "neither confirming nor denying" the nuclear capability of its ships. In the same year the US retaliated with the Broomfield Act, designating New Zealand as a "friend" rather than an "ally". Relations between New Zealand and the US have had several ups and downs since then. In recent years, some voices have suggested repealing the anti-nuclear legislation, especially the ACT New Zealand political party; up until February 2006 the National Party was in favour of holding a referendum on the issue. However, public opinion remains strongly in favour of the country's status as a nuclear-free zone.
In May 2006, US Assistant Secretary of State for East Asia and Pacific Affairs, Christopher Hill, described the disagreement between the US and New Zealand as "a relic", but also signalled that the US wanted a closer defence relationship with New Zealand and praised New Zealand’s involvement in Afghanistan and reconstruction in Iraq. "Rather than trying to change each other's minds on the nuclear issue, which is a bit of a relic, I think we should focus on things we can make work," he told the Australian Financial Review. Pressure from the United States on New Zealand's foreign policy increased in 2006, with U.S. trade officials linking the repeal of the ban on American nuclear ships entering New Zealand's ports to a potential free trade agreement between the two countries. Relations between France and New Zealand were strained for two short periods in the 1980s and 1990s over the French nuclear tests at Moruroa and the bombing of the Rainbow Warrior in Auckland harbour. The latter was widely regarded as an act of state terrorism against New Zealand's sovereignty and was ordered by then French President François Mitterrand, although he denied any involvement at the time. These events worked to strengthen New Zealand's resolve to retain its anti-nuclear policy. Relations between the two countries are now cordial, with strong trade and many new bilateral links. In 2017, New Zealand signed the United Nations Treaty on the Prohibition of Nuclear Weapons. Foreign Affairs Minister Gerry Brownlee said the treaty is "consistent with New Zealand's long-standing commitment to international nuclear disarmament efforts". Latin America New Zealand has well-established links to a number of Latin American countries, particularly in the economic sphere. New Zealand has embassies in Mexico City, Santiago, Brasília and Buenos Aires – the first of which (Santiago) opened in 1972.
The New Zealand Government's Latin America Strategy, published in May 2010, estimates New Zealand's annual exports to the region at NZ$1 billion, and New Zealand investments in the region (in areas such as agri-technology, energy, fisheries, and specialised manufacturing) at around NZ$1.3 billion. The Strategy argues that there is considerable scope to expand New Zealand's investment and services trade in the region. Focusing on six countries (Brazil, Mexico, Chile, Argentina, Uruguay and Peru), the Strategy posits that New Zealand should be seeking to: promote a better understanding of the region among New Zealand businesses to help identify prospects for increased investment, trade and joint ventures; lower barriers to business between New Zealand and Latin America; promote New Zealand tourism in the region; improve airlinks between New Zealand and the region; deepen education and research and science links. There are significant flows of tourists and students from Latin America to New Zealand. For example, in the year to June 2010, around 30,000 Latin Americans visited |
national parks, nature reserves, and biological reserves. The country had a 2019 Forest Landscape Integrity Index mean score of 3.63/10, ranking it 146th globally out of 172 countries. Geophysically, Nicaragua sits between the Caribbean Plate, an oceanic tectonic plate underlying Central America, and the Cocos Plate. Since Central America is a major subduction zone, Nicaragua hosts most of the Central American Volcanic Arc. Pacific lowlands In the west of the country, these lowlands consist of a broad, hot, fertile plain. Punctuating this plain are several large volcanoes of the Cordillera Los Maribios mountain range, including Mombacho just outside Granada, and Momotombo near León. The lowland area runs from the Gulf of Fonseca to Nicaragua's Pacific border with Costa Rica south of Lake Nicaragua. Lake Nicaragua is the largest freshwater lake in Central America (20th largest in the world), and is home to some of the world's rare freshwater sharks (the Nicaragua shark). The Pacific lowlands region is the most populous, with over half of the nation's population. The eruptions of western Nicaragua's 40 volcanoes, many of which are still active, have sometimes devastated settlements but also have enriched the land with layers of fertile ash. The geologic activity that produces vulcanism also breeds powerful earthquakes. Tremors occur regularly throughout the Pacific zone, and earthquakes have nearly destroyed the capital city, Managua, more than once. Most of the Pacific zone is tierra caliente, the "hot land" of tropical Spanish America found at low elevations. Temperatures remain virtually constant throughout the year. After a dry season lasting from November to April, rains begin in May and continue to October, supplying the Pacific lowlands with most of their annual precipitation. Good soils and a favourable climate combine to make western Nicaragua the country's economic and demographic centre.
The southwestern shore of Lake Nicaragua lies only a short distance from the Pacific Ocean. Thus the lake and the San Juan River were often proposed in the 19th century as the longest part of a canal route across the Central American isthmus. Canal proposals were periodically revived in the 20th and 21st centuries. Roughly a century after the opening of the Panama Canal, the prospect of a Nicaraguan ecocanal remains a topic of interest. In addition to its beach and resort communities, the Pacific lowlands contain most of Nicaragua's Spanish colonial architecture and artifacts. Cities such as León and Granada abound in colonial architecture; founded in 1524, Granada is one of the oldest colonial cities in the Americas. North central highlands Northern Nicaragua is the most diversified region, producing coffee, cattle, milk products, vegetables, wood, gold, and flowers. Its extensive forests, rivers and geography are suited for ecotourism. The central highlands are a significantly less populated and economically developed area in the north, between Lake Nicaragua and the Caribbean. Forming the country's tierra templada, or "temperate land", the highlands enjoy mild temperatures. This region has a longer, wetter rainy season than the Pacific lowlands, making erosion a problem on its steep slopes. Rugged terrain, poor soils, and low population density characterize the area as a whole, but the northwestern valleys are fertile and well settled. The area has a cooler climate than the Pacific lowlands. About a quarter of the country's agriculture takes place in this region, with coffee grown on the higher slopes. Oaks, pines, moss, ferns and orchids are abundant in the cloud forests of the region. Bird life in the forests of the central region includes resplendent quetzals, goldfinches, hummingbirds, jays and toucanets. Caribbean lowlands This large rainforest region is irrigated by several large rivers and is sparsely populated.
The area comprises 57% of the nation's territory and holds most of its mineral resources. It has been heavily exploited, but much natural diversity remains. The Rio Coco, the largest river in Central America, forms the border with Honduras. The Caribbean coastline is much more sinuous than its generally straight Pacific counterpart; lagoons and deltas make it very irregular. Nicaragua's Bosawás Biosphere Reserve is in the Atlantic lowlands, part of which is located in the municipality of Siuna; it protects a large tract of La Mosquitia forest – almost 7% of the country's area – making it the largest rainforest north of the Amazon in Brazil. The municipalities of Siuna, Rosita, and Bonanza, known as the "Mining Triangle", are located in the region known as the North Caribbean Coast Autonomous Region, in the Caribbean lowlands. Bonanza still contains an active gold mine owned by HEMCO. Siuna and Rosita do not have active mines, but panning for gold is still very common in the region. Nicaragua's tropical east coast is very different from the rest of the country. The climate is predominantly tropical, with high temperature and high humidity. Around the area's principal city of Bluefields, English is widely spoken along with the official Spanish. The population more closely resembles that found in many typical Caribbean ports than the rest of Nicaragua. A great variety of birds can be observed, including eagles, toucans, parakeets and macaws. Other animal life in the area includes several species of monkeys, as well as anteaters, white-tailed deer and tapirs. Flora and fauna Nicaragua is home to a rich variety of plants and animals. Nicaragua is located in the middle of the Americas, and this privileged location has enabled the country to serve as host to a great biodiversity.
This location, along with the climate and light altitudinal variation, allows the country to harbor 248 species of amphibians and reptiles, 183 species of mammals, 705 bird species, 640 fish species, and about 5,796 species of plants. The region of great forests is located on the eastern side of the country. Rainforests are found in the Río San Juan Department and in the autonomous regions of RAAN and RAAS. This biome groups together the greatest biodiversity in the country and is largely protected by the Indio Maíz Biological Reserve in the south and the Bosawás Biosphere Reserve in the north. The Nicaraguan jungles are considered the lungs of Central America and comprise the second-largest rainforest in the Americas. There are currently 78 protected areas in Nicaragua, covering about 17% of its landmass. These include wildlife refuges and nature reserves that shelter a wide range of ecosystems. More than 1,400 animal species and some 12,000 species of plants have been classified thus far in Nicaragua, with an estimated 5,000 plant species not yet classified. The bull shark is a species of shark that can survive for an extended period of time in fresh water. It can be found in Lake Nicaragua and the San Juan River, where it is often referred to as the "Nicaragua shark". Nicaragua has recently banned freshwater fishing of the Nicaragua shark and the sawfish in response to the declining populations of these animals. Government Politics of Nicaragua takes place in a framework of a presidential representative democratic republic, in which the President of Nicaragua is both head of state and head of government, and of a multi-party system. Executive power is exercised by the government. Legislative power is vested in both the government and the National Assembly. The judiciary makes up the third branch of government.
Between 2007 and 2009, Nicaragua's major political parties discussed the possibility of moving from a presidential system to a parliamentary system. The stated reason was that there would be a clear differentiation between the head of government (prime minister) and the head of state (president). Nevertheless, it was later argued that the true reason behind this proposal was to find a legal way for President Ortega to stay in power after January 2012, when his second and last government period was expected to end. Ortega was reelected to a third term in November 2016. Foreign relations Nicaragua pursues an independent foreign policy. Nicaragua is in territorial disputes with Colombia over the Archipelago de San Andrés y Providencia and Quita Sueño Bank, and with Costa Rica over a boundary dispute involving the San Juan River. Military The armed forces of Nicaragua consist of various military contingents: an army, a navy and an air force. There are roughly 14,000 active duty personnel, far fewer than during the Nicaraguan Revolution. Although the army has had a rough military history, a portion of its forces, known as the national guard, became integrated with what is now the National Police of Nicaragua. In essence, the police became a gendarmerie, although they are rarely, if ever, labeled as such. The other elements and manpower that were not devoted to the national police were sent over to cultivate the new Army of Nicaragua. The age to serve in the armed forces is 17, and there is no conscription. The military budget is roughly 0.7% of Nicaragua's expenditures. In 2017, Nicaragua signed the UN Treaty on the Prohibition of Nuclear Weapons. Law enforcement The National Police of Nicaragua (Spanish: Policía Nacional Nicaragüense) is the country's national police force.
The force is in charge of regular police functions and at times works in conjunction with the Nicaraguan military, making it an indirect and rather subtle version of a gendarmerie. However, the Nicaraguan National Police work separately from the nation's military and under a different established set of norms. According to a recent US Department of State report, corruption is endemic, especially within law enforcement and the judiciary, and arbitrary arrests, torture, and harsh prison conditions are the norm. Nicaragua is the safest country in Central America and one of the safest in Latin America, according to the United Nations Development Program, with a homicide rate of 8.7 per 100,000 inhabitants. Administrative divisions Nicaragua is a unitary republic. For administrative purposes it is divided into 15 departments (departamentos) and two self-governing regions (autonomous communities) based on the Spanish model. The departments are then subdivided into 153 municipios (municipalities). The two autonomous regions are the North Caribbean Coast Autonomous Region and the South Caribbean Coast Autonomous Region, often referred to as RACCN and RACCS, respectively. Economy Nicaragua is among the poorest countries in the Americas. Its gross domestic product (GDP) in purchasing power parity (PPP) in 2008 was estimated at US$17.37 billion. Agriculture represents 15.5% of GDP, the highest percentage in Central America. Remittances account for over 15% of the Nicaraguan GDP; close to one billion dollars are sent to the country by Nicaraguans living abroad. The economy grew at a rate of about 4% in 2011. By 2019, given restrictive taxes and a civil conflict, it recorded negative growth of 3.9%; the International Monetary Fund forecast for 2020 was a further decline of 6% due to COVID-19.
The restrictive tax measures put in place in 2019 and a political crisis over social security negatively affected the country's weak public spending and investor confidence in sovereign debt. According to the United Nations Development Programme, 48% of the population of Nicaragua live below the poverty line and 79.9% live on less than $2 per day. According to UN figures, 80% of the indigenous people (who make up 5% of the population) live on less than $1 per day. According to the World Bank, Nicaragua ranked 123rd out of 190 economies for ease of starting a business. In 2007, Nicaragua's economy was labelled "62.7% free" by the Heritage Foundation, with high levels of fiscal, government, labor, investment, financial, and trade freedom. It ranked as the 61st freest economy, and 14th (of 29) in the Americas. In March 2007, Poland and Nicaragua signed an agreement to write off $30.6 million that the Nicaraguan government had borrowed in the 1980s. Inflation fell from 33,500% in 1988 to 9.45% in 2006, and the foreign debt was cut in half. Nicaragua is primarily an agricultural country; agriculture constitutes 60% of its total exports, which annually yield approximately US$300 million. Nearly two-thirds of the coffee crop comes from the northern part of the central highlands, in the area north and east of the town of Estelí. Tobacco, grown in the same northern highlands region as coffee, has become an increasingly important cash crop since the 1990s, with annual exports of leaf and cigars in the neighborhood of $200 million per year. Soil erosion and pollution from the heavy use of pesticides have become serious concerns in the cotton district, where yields and exports have both been declining since 1985.
Today most of Nicaragua's bananas are grown in the northwestern part of the country near the port of Corinto; sugarcane is also grown in the same district. Cassava, a root crop somewhat similar to the potato and the main ingredient in tapioca pudding, is an important food in tropical regions. Nicaragua's agricultural sector has benefited from the country's strong ties to Venezuela, which is estimated to import approximately $200 million in agricultural goods. In the 1990s, the government initiated efforts to diversify agriculture. Some of the new export-oriented crops were peanuts, sesame, melons, and onions. Fishing boats on the Caribbean side bring shrimp as well as lobsters into processing plants at Puerto Cabezas, Bluefields, and Laguna de Perlas. A turtle fishery thrived on the Caribbean coast before it collapsed from overexploitation. Mining is a growing industry in Nicaragua, though it currently contributes less than 1% of gross domestic product (GDP). Restrictions are being placed on lumbering due to increased environmental concerns about destruction of the rain forests, but lumbering continues despite these obstacles; indeed, a single hardwood tree may be worth thousands of dollars. During the war between the US-backed Contras and the government of the Sandinistas in the 1980s, much of the country's infrastructure was damaged or destroyed. Transportation throughout the nation is often inadequate; for example, it was until recently impossible to travel all the way by highway from Managua to the Caribbean coast. A new road between Nueva Guinea and Bluefields was completed in 2019 and allows regular bus service to the capital. The Centroamérica power plant on the Tuma River in the Central highlands has been expanded, and other hydroelectric projects have been undertaken to help provide electricity to the nation's newer industries.
Nicaragua has long been considered a possible site for a new canal that could supplement the Panama Canal, connecting the Caribbean Sea (and therefore the Atlantic Ocean) with the Pacific Ocean. Nicaragua's minimum wage is among the lowest in the Americas and in the world. Remittances are equivalent to roughly 15% of the country's gross domestic product. Growth in the maquila sector slowed in the first decade of the 21st century with rising competition from Asian markets, particularly China. Land is the traditional basis of wealth in Nicaragua, with great fortunes coming from the export of staples such as coffee, cotton, beef, and sugar. Almost all of the upper class and nearly a quarter of the middle class are substantial landowners. A 1985 government study classified 69.4 percent of the population as poor on the basis that they were unable to satisfy one or more of their basic needs in housing, sanitary services (water, sewage, and garbage collection), education, and employment. The defining standards for this study were very low; housing was considered substandard if it was constructed of discarded materials with dirt floors or if it was occupied by more than four persons per room. Rural workers are dependent on agricultural wage labor, especially in coffee and cotton. Only a small fraction hold permanent jobs; most are migrants who follow crops during the harvest period and find other work during the off-season. The "lower" peasants are typically smallholders without sufficient land to sustain a family; they also join the harvest labor force. The "upper" peasants have sufficient resources to be economically independent; they produce enough surplus, beyond their personal needs, to allow them to participate in the national and world markets. The urban lower class is characterized by the informal sector of the economy.
The informal sector consists of small-scale enterprises that utilize traditional technologies and operate outside the legal regime of labor protections and taxation. Workers in the informal sector are self-employed, unsalaried family workers or employees of small enterprises, and they are generally poor. Nicaragua's informal sector workers include tinsmiths, mattress makers, seamstresses, bakers, shoemakers, and carpenters; people who take in laundry and ironing or prepare food for sale in the streets; and thousands of peddlers, owners of small businesses (often operating out of their own homes), and market stall operators. Some work alone, but others labor in the small talleres (workshops/factories) that are responsible for a large share of the country's industrial production. Because informal sector earnings are generally very low, few families can subsist on one income. Like most Latin American nations, Nicaragua is characterized by a very small upper class, roughly 2% of the population, that is very wealthy and wields most of the political and economic power not in the hands of foreign corporations and private industries. These oligarchical families have ruled Nicaragua for generations, and their wealth is both horizontally and vertically integrated across politics and the economy. Nicaragua is currently a member of the Bolivarian Alliance for the Americas, which is also known as ALBA. ALBA has proposed creating a new currency, the Sucre, for use among its members. In essence, this means that the Nicaraguan córdoba would be replaced with the Sucre, as would the currencies of Venezuela, Ecuador, Bolivia, Honduras, Cuba, Saint Vincent and the Grenadines, Dominica and Antigua and Barbuda. Nicaragua is considering construction of a canal linking the Atlantic to the Pacific Ocean, which President Daniel Ortega has said will give Nicaragua its "economic independence."
Scientists have raised concerns about environmental impacts, but the government has maintained that the canal will benefit the country by creating new jobs and potentially increasing its annual growth to an average of 8% per year. The project was scheduled to begin construction in December 2014; however, the Nicaragua Canal has yet to be started. Tourism By 2006, tourism had become the second largest industry in Nicaragua. Previously, tourism had grown about 70% nationwide over a period of seven years, at rates of 10%–16% annually. This growth caused income from tourism to rise by more than 300% over a period of 10 years. The growth in tourism has also positively affected the agricultural, commercial, and finance industries, as well as the construction industry. President Daniel Ortega has stated his intention to use tourism to combat poverty throughout the country. The results for Nicaragua's tourism-driven economy have been significant, with the nation welcoming one million tourists in a calendar year for the first time in its history in 2010. Every year about 60,000 U.S. citizens visit Nicaragua, primarily business people, tourists, and those visiting relatives, and some 5,300 people from the U.S. reside in the country. The majority of tourists who visit Nicaragua are from the U.S., Central or South America, and Europe. According to the Ministry of Tourism of Nicaragua (INTUR), the colonial cities of León and Granada are the preferred spots for tourists. Other main tourist attractions include the cities of Masaya and Rivas, as well as San Juan del Sur, El Ostional, the Fortress of the Immaculate Conception, Ometepe Island, the Mombacho volcano, and the Corn Islands. In addition, ecotourism, sport fishing and surfing attract many tourists to Nicaragua.
According to the TV Noticias news program, the main attractions in Nicaragua for tourists are the beaches, the scenic routes, the architecture of cities such as León and Granada, ecotourism, and agritourism, particularly in northern Nicaragua. As a result of increased tourism, Nicaragua saw its foreign direct investment increase by 79.1% from 2007 to 2009. Nicaragua is referred to as "the land of lakes and volcanoes" due to the number of lagoons and lakes, and the chain of volcanoes that runs from the north to the south along the country's Pacific side. Today, only 7 of the 50 volcanoes in Nicaragua are considered active. Many of these volcanoes offer opportunities for tourists, with activities such as hiking, climbing, camping, and swimming in crater lakes. The Apoyo Lagoon Natural Reserve was created by the eruption of the Apoyo Volcano about 23,000 years ago, which left a huge 7 km-wide crater that gradually filled with water and is surrounded by the old crater wall. The rim of the lagoon is lined with restaurants, many of which have kayaks available. Besides exploring the forest around it, visitors practice many water sports in the lagoon, most notably kayaking. Sand skiing has become a popular attraction at the Cerro Negro volcano in León. Both dormant and active volcanoes can be climbed; some of the most visited include the Masaya Volcano, Momotombo, Mombacho, Cosigüina and Ometepe's Maderas and Concepción. Ecotourism aims to be ecologically and socially conscious; it focuses on local culture, wilderness, and adventure. Nicaragua's ecotourism sector is growing every year, offering a range of tours and destinations for adventure travelers. Nicaragua has three eco-regions (the Pacific, Central, and Atlantic) which contain volcanoes, tropical rainforests, and agricultural land.
The majority of the eco-lodges and other environmentally focused tourist destinations are found on Ometepe Island, located in the middle of Lake Nicaragua just an hour's boat ride from Granada. While some are foreign-owned, others are owned by local families. Demographics According to a 2014 study published in the journal Genetics and Molecular Biology, European ancestry predominates in 69% of Nicaraguans, followed by African ancestry in 20%, and lastly indigenous ancestry in 11%. A Japanese study of "Genomic Components in America's demography" found that, on average, the ancestry of Nicaraguans is 58–62% European, 28% Native American, and 14% African, with a very small Near Eastern contribution. Non-genetic data from the CIA World Factbook establish that of Nicaragua's 2016 population of 5,966,798, around 69% are mestizo, 17% white, 5% Native American, and 9% black and other races. This fluctuates with changes in migration patterns. The population is 58% urban. The capital Managua is the biggest city, with an estimated population of 1,042,641 in 2016. In 2005, over 5 million people lived in the Pacific, Central and North regions, and 700,000 in the Caribbean region. There is a growing expatriate community, the majority of whom moved for business, investment or retirement; they come from across the world, including the US, Canada, Taiwan, and European countries, and most have settled in Managua, Granada and San Juan del Sur. Many Nicaraguans live abroad, particularly in Costa Rica, the United States, Spain, Canada, and other Central American countries. Nicaragua has a population growth rate of 1.5%, the result of one of the highest birth rates in the Western Hemisphere: 17.7 per 1,000 as of 2017. The death rate was 4.7 per 1,000 during the same period according to the United Nations.
Ethnic groups The majority of the Nicaraguan population is composed of mestizos, roughly 69%. About 17% of Nicaragua's population is white, the majority of them of Spanish descent, with others of German, Italian, English, Turkish, Danish or French ancestry. Black Creoles About 9% of Nicaragua's population is black and mainly resides on the country's Caribbean (or Atlantic) coast. The black population is mostly composed of black English-speaking Creoles who are the descendants of escaped or shipwrecked slaves; many carry the names of Scottish settlers who brought slaves with them, such as Campbell, Gordon, Downs, and Hodgson. Although many Creoles supported Somoza because of his close association with the United States, they rallied to the Sandinista cause in July 1979, only to reject the revolution soon afterwards in response to a new phase of 'westernization' and the imposition of central rule from Managua. There is a smaller number of Garifuna, a people of mixed West African, Carib and Arawak descent. In the mid-1980s, the government divided the Zelaya Department – consisting of the eastern half of the country – into two autonomous regions and granted the black and indigenous people of this region limited self-rule within the republic. Indigenous population The remaining 5% of Nicaraguans are indigenous, the descendants of the country's original inhabitants. Nicaragua's pre-Columbian population consisted of many indigenous groups. In the western region, the Nahuas (Nicarao people) were present along with other groups such as the Chorotega people and the Subtiabas (also known as Maribios or Hokan Xiu). The central region and the Caribbean coast of Nicaragua were inhabited by indigenous peoples of Macro-Chibchan language groups that had migrated to and from South America in ancient times, primarily from what are now Colombia and Venezuela.
These groups include the present-day Matagalpas, Miskitos and Ramas, as well as the Mayangnas and Ulwas, who are also known as Sumos. In the 19th century, there was a substantial indigenous minority, but this group was largely assimilated culturally into the mestizo majority. The Garifuna, a people of mixed African and indigenous descent, are also present, mainly on the Caribbean coast. Languages Nicaraguan Spanish has many indigenous influences and several distinguishing characteristics. For example, some Nicaraguans have a tendency to replace /s/ with /h/ when speaking. Although Spanish is spoken throughout, the country has great variety: vocabulary, accents and colloquial language can vary between towns and departments. On the Caribbean coast, indigenous languages, English-based creoles, and Spanish are spoken. The Miskito language, spoken by the Miskito people as a first language and by some other indigenous and Afro-descendant peoples as a second, third, or fourth language, is the most commonly spoken indigenous language. The indigenous Misumalpan languages of Mayangna and Ulwa are spoken by the respective peoples of the same names. Many Miskito, Mayangna, and Sumo people also speak Miskito Coast Creole, and a large majority also speak Spanish. Fewer than three dozen of nearly 2,000 Rama people speak their Chibchan language fluently, with nearly all Ramas speaking Rama Cay Creole and the vast majority speaking Spanish. Linguists have attempted to document and revitalize the language over the past three decades. The Garifuna people, descendants of indigenous and Afro-descendant people who came to Nicaragua from Honduras in the early twentieth century, have recently attempted to revitalize their Arawakan language. The majority speak Miskito Coast Creole as their first language and Spanish as their second.
The Creole or Kriol people, descendants of enslaved Africans brought to the Mosquito Coast during the British colonial period and of European, Chinese, Arab, and British West Indian immigrants, also speak Miskito Coast Creole as their first language and Spanish as their second. Largest cities Religion Religion plays a significant part in the culture of Nicaragua and is afforded special protections in the constitution. Religious freedom, which has been guaranteed since 1939, and religious tolerance are promoted by the government and the constitution. Nicaragua has no official religion. Catholic bishops are expected to lend their authority to important state occasions, and their pronouncements on national issues are closely followed. They can be called upon to mediate between contending parties at moments of political crisis. In 1979, Miguel D'Escoto Brockman, a priest who had embraced Liberation Theology, served in the government as foreign minister when the Sandinistas came to power. The largest denomination, and traditionally the religion of the majority, is the Roman Catholic Church. It came to Nicaragua in the 16th century with the Spanish conquest and remained, until 1939, the established faith. The number of practicing Roman Catholics has been declining, while membership of evangelical Protestant groups and The Church of Jesus Christ of Latter-day Saints (LDS Church) has been growing rapidly since the 1990s. There is a significant LDS missionary effort in Nicaragua, with two missions and 95,768 members of the LDS Church (1.54% of the population). There are also strong Anglican and Moravian communities on the Caribbean coast in what once constituted the sparsely populated Mosquito Coast colony, which was under British influence for nearly three centuries. Protestantism was brought to the Mosquito Coast mainly by British and German colonists in the forms of Anglicanism and the Moravian Church.
Other kinds of Protestant and other Christian denominations were introduced to the rest of Nicaragua during the 19th century. Popular religion revolves around the saints, who are perceived as intercessors between human beings and God. Most localities, from the capital of Managua to small rural communities, honor patron saints, selected from the Roman Catholic calendar, with annual fiestas. In many communities, a rich lore has grown up around the celebrations of patron saints, such as Managua's Saint Dominic (Santo Domingo), honored in August with two colorful, often riotous, day-long processions through the city. The high point of Nicaragua's religious calendar for the masses is neither Christmas nor Easter, but La Purísima, a week of festivities in early December dedicated to the Immaculate Conception, during which elaborate altars to the Virgin Mary are constructed in homes and workplaces. Buddhism has grown with a steady influx of immigrants. Although Jews have been living in Nicaragua since the 18th century, the Jewish population is small, numbering less than 200 people in 2017. Of these, 112 were recent converts who claimed Sephardic Jewish ancestry. As of 2007, approximately 1,200 to 1,500 Nicaraguan residents practiced Islam, most of them Sunnis who are resident aliens or naturalized citizens from Palestine, Libya, and Iran, or natural-born Nicaraguan descendants of those groups. Immigration Relative to its population, Nicaragua has not experienced large waves of immigration. The number of immigrants in Nicaragua, whether from other Latin American countries or elsewhere, never surpassed 1% of its total population before 1995. The 2005 census showed the foreign-born population at 1.2%, having risen a mere 0.06% in 10 years. In the 19th century, Nicaragua experienced modest waves of immigration from Europe.
In particular, families from Germany, Italy, Spain, France and Belgium immigrated to Nicaragua, settling particularly in the departments of the Central and Pacific regions. Also present is a small Middle Eastern-Nicaraguan community of Syrians, Armenians, Jewish Nicaraguans, and Lebanese people, numbering about 30,000. There is an East Asian community mostly consisting of Chinese, Taiwanese, and Japanese. The Chinese Nicaraguan population is estimated at around 12,000; the Chinese arrived in the late 19th century, but their presence was not documented until the 1920s. Diaspora The Civil War forced many Nicaraguans to start lives outside of their country. Many people emigrated during the 1990s and the first decade of the 21st century due to the lack of employment opportunities and poverty. The majority of the Nicaraguan diaspora migrated to the United States and Costa Rica; today one in six Nicaraguans lives in these two countries. The diaspora has also seen Nicaraguans settling in smaller communities in other parts of the world, particularly Western Europe. Small communities of Nicaraguans are found in France, Germany, Italy, Spain, Norway, Sweden and the United Kingdom. Communities also exist in Australia and New Zealand. Canada, Brazil and Argentina host small groups of these communities, and in Asia, Japan hosts a small Nicaraguan community. Due to extreme poverty at home, many Nicaraguans are now living and working in neighboring El Salvador, a country that has the US dollar as its currency. Healthcare Although Nicaragua's health outcomes have improved over the past few decades with the efficient utilization of resources relative to other Central American nations, healthcare in Nicaragua still confronts challenges in responding to its population's diverse healthcare needs. The Nicaraguan government guarantees universal free health care for its citizens.
However, limitations of current delivery models and the unequal distribution of resources and medical personnel contribute to a persistent lack of quality care in the more remote areas of Nicaragua, especially among rural communities in the Central and Atlantic regions. To respond to the dynamic needs of localities, the government has adopted a decentralized model that emphasizes community-based preventive and primary medical care. Education The adult literacy rate in 2005 was 78.0%. Primary education is free in Nicaragua. A system of private schools exists, many of which are religiously affiliated and often have more robust English programs. As of 1979, the educational system was one of the poorest in Latin America. One of the first acts of the newly elected Sandinista government in 1980 was an extensive and successful literacy campaign, using secondary school students, university students and teachers as volunteer teachers: it reduced the overall illiteracy rate from 50.3% to 12.9% within only five months. This was one of a number of large-scale programs which received international recognition for their gains in literacy, health care, education, childcare, unions, and land reform. The Sandinistas also added a leftist ideological content to the curriculum, which was removed after 1990. In September 1980, UNESCO awarded Nicaragua the Soviet Union-sponsored Nadezhda Krupskaya award for the literacy campaign. Gender equality Nicaragua ranks high on gender equality among countries in Latin America. Globally, the World Economic Forum ranked Nicaragua twelfth on gender equality in 2015, and fifth in its 2020 report, behind only northern European countries. Nicaragua was among the many countries in Latin America and the Caribbean to ratify the Convention on the Elimination of All Forms of Discrimination against Women, which aimed to promote women's rights.
In 2009, a Special Ombudsman for Sexual Diversity position was created within its Office of the Human Rights Ombudsman, and in 2014 the Health Ministry banned discrimination based on gender identity and sexual orientation. Nevertheless, discrimination against LGBTQ individuals is common, particularly in housing, education, and the workplace. The Human Development Report ranked Nicaragua 106 out of 160 countries in the Gender Inequality Index (GII) in 2017. It reflects gender-based inequalities in three dimensions: reproductive health, empowerment, and economic activity. Culture Nicaraguan culture has strong folklore, music and
associated with the Toltec civilization. Both Chorotegas and Nicaraos originated in Mexico's Cholula valley, and migrated south. A third group, the Subtiabas, were an Oto-Manguean people who migrated from the Mexican state of Guerrero around 1200 CE. Additionally, there were trade-related colonies in Nicaragua set up by the Aztecs starting in the 14th century. Spanish era (1523–1821) In 1502, on his fourth voyage, Christopher Columbus became the first European known to have reached what is now Nicaragua as he sailed southeast toward the Isthmus of Panama. Columbus explored the Mosquito Coast on the Atlantic side of Nicaragua but did not encounter any indigenous people. Twenty years later, the Spaniards returned to Nicaragua, this time to its southwestern part. The first attempt to conquer Nicaragua was by the conquistador Gil González Dávila, who had arrived in Panama in January 1520. In 1522, González Dávila ventured to the area that later became the Rivas Department of Nicaragua. There he encountered an indigenous Nahua tribe led by chief Macuilmiquiztli, whose name has sometimes been erroneously given as "Nicarao" or "Nicaragua". The tribe's capital was Quauhcapolca. González Dávila was able to converse with Macuilmiquiztli thanks to two indigenous interpreters he had brought along who had learned Spanish.
After exploring and gathering gold in the fertile western valleys, González Dávila and his men were attacked and driven off by the Chorotega, led by chief Diriangén. The Spanish tried to convert the tribes to Christianity; Macuilmiquiztli's tribe was baptized, but Diriangén was openly hostile to the Spaniards. Western Nicaragua, on the Pacific coast, became a port and shipbuilding facility for the galleons plying the waters between Manila, Philippines and Acapulco, Mexico. The first permanent Spanish settlements were founded in 1524. That year, the conquistador Francisco Hernández de Córdoba founded two of Nicaragua's main cities: Granada on Lake Nicaragua, and then León, west of Lake Managua. Córdoba soon built defenses for the cities and fought against incursions by other conquistadors. Córdoba was later publicly beheaded for having defied his superior, Pedro Arias Dávila. Córdoba's tomb and remains were discovered in 2000 in the ruins of León Viejo. The clashes among Spanish forces did not impede their destruction of the indigenous people and their culture. The series of battles came to be known as the "War of the Captains". Pedro Arias Dávila emerged as the winner; although he lost control of Panama, he moved to Nicaragua and established his base in León. In 1527, León became the capital of the colony, and through diplomacy Arias Dávila became its first governor. Without women in their parties, the Spanish conquerors took Nahua and Chorotega wives and partners, beginning the multiethnic mix of indigenous and European stock now known as "mestizo", which constitutes the great majority of the population in western Nicaragua. Many indigenous people were killed by European infectious diseases, compounded by neglect by the Spaniards, who controlled their subsistence. Many other indigenous people were captured and transported as slaves to Panama and Peru between 1526 and 1540. In 1610, the Momotombo volcano erupted, destroying the city of León.
The city was rebuilt northwest of the original site, which is now known as the ruins of León Viejo. During the American Revolutionary War, Central America was subject to conflict between Britain and Spain. British navy admiral Horatio Nelson led expeditions in the Battle of San Fernando de Omoa in 1779 and on the San Juan River in 1780, the latter of which had temporary success before being abandoned due to disease. Independent Nicaragua from 1821 to 1909 The Act of Independence of Central America dissolved the Captaincy General of Guatemala in September 1821, and Nicaragua soon became part of the First Mexican Empire. In July 1823, after the overthrow of the Mexican monarchy in March of the same year, Nicaragua joined the newly formed United Provinces of Central America, a country later known as the Federal Republic of Central America. Nicaragua definitively became an independent republic in 1838. The early years of independence were characterized by rivalry between the Liberal elite of León and the Conservative elite of Granada, which often degenerated into civil war, particularly during the 1840s and 1850s. Managua rose to undisputed preeminence as the nation's capital in 1852 to allay the rivalry between the two feuding cities. Following the start of the California Gold Rush in 1848, Nicaragua provided a route for travelers from the eastern United States to journey to California by sea, via the San Juan River and Lake Nicaragua. Invited by the Liberals in 1855 to join their struggle against the Conservatives, the United States adventurer and filibuster William Walker set himself up as President of Nicaragua after conducting a farcical election in 1856; his presidency lasted less than a year. Military forces from Costa Rica, Honduras, El Salvador, Guatemala, and Nicaragua itself united to drive Walker out of Nicaragua in 1857, ushering in three decades of Conservative rule.
Great Britain, which had claimed the Mosquito Coast as a protectorate since 1655, ceded the area to Honduras in 1859 before transferring it to Nicaragua in 1860. The Mosquito Coast remained an autonomous area until 1894, when José Santos Zelaya, President of Nicaragua from 1893 to 1909, negotiated its integration into Nicaragua. In his honor, the region became "Zelaya Department". Throughout the late 19th century, the United States and several European powers considered various schemes to link the Pacific Ocean to the Atlantic by building a canal across Nicaragua. United States occupation (1909–1933) In 1909, the United States supported the conservative-led forces rebelling against President Zelaya. U.S. motives included differences over the proposed Nicaragua Canal, Nicaragua's potential to destabilize the region, and Zelaya's attempts to regulate foreign access to Nicaraguan natural resources. On November 18, 1909, U.S. warships were sent to the area after 500 revolutionaries (including two Americans) were executed by order of Zelaya. The U.S. justified the intervention by claiming to protect U.S. lives and property. Zelaya resigned later that year. In August 1912, the President of Nicaragua, Adolfo Díaz, requested that the secretary of war, General Luis Mena, resign, fearing that he was leading an insurrection. Mena fled Managua with his brother, the chief of police of Managua, to start an insurrection. After Mena's troops captured steam boats belonging to an American company, the U.S. delegation asked President Díaz to ensure the safety of American citizens and property during the insurrection. He replied that he could not, and asked the U.S. to intervene in the conflict. U.S. Marines occupied Nicaragua from 1912 to 1933, except for a nine-month period beginning in 1925. In 1914, the Bryan–Chamorro Treaty was signed, giving the U.S. control over a proposed canal through Nicaragua, as well as leases for potential canal defenses. After the U.S.
Marines left, another violent conflict between Liberals and Conservatives in 1926 resulted in the return of U.S. Marines. From 1927 to 1933, rebel general Augusto César Sandino led a sustained guerrilla war, first against the Conservative regime and then against the U.S. Marines. Before they left in 1933, the Americans set up the Guardia Nacional (national guard), a combined military and police force trained and equipped by the Americans and designed to be loyal to U.S. interests. After the U.S. Marines withdrew from Nicaragua in January 1933, Sandino and the newly elected administration of President Juan Bautista Sacasa reached an agreement that Sandino would cease his guerrilla activities in return for amnesty, a land grant for an agricultural colony, and retention of an armed band of 100 men for a year. However, due to growing hostility between Sandino and National Guard director Anastasio Somoza García, and fearing armed opposition from Sandino, Somoza García ordered his assassination. Sacasa invited Sandino to dinner and to sign a peace treaty at the Presidential House on the night of February 21, 1934. After Sandino left the Presidential House, his car was stopped by National Guard soldiers, who kidnapped him and assassinated him later that night. Later, hundreds of men, women, and children from Sandino's agricultural colony were executed. Somoza dynasty (1927–1979) Nicaragua has experienced several military dictatorships, the longest being the hereditary dictatorship of the Somoza family, who ruled for 43 nonconsecutive years during the 20th century. The Somoza family came to power as part of a U.S.-engineered pact in 1927 that stipulated the formation of the Guardia Nacional to replace the Marines who had long dominated the country.
Somoza García slowly eliminated officers in the national guard who might have stood in his way, and then deposed Sacasa and became president on January 1, 1937, in a rigged election. In 1941, during the Second World War, Nicaragua declared war on Japan (8 December), Germany (11 December), Italy (11 December), Bulgaria (19 December), Hungary (19 December) and Romania (19 December). Only Romania reciprocated, declaring war on Nicaragua on the same day (19 December 1941). No soldiers were sent to the war, but Somoza García confiscated properties held by German Nicaraguan residents. In 1945, Nicaragua was among the first countries to ratify the United Nations Charter. On September 29, 1956, Somoza García was shot to death by Rigoberto López Pérez, a 27-year-old Liberal Nicaraguan poet. Luis Somoza Debayle, the eldest son of the late president, was appointed president by the congress and officially took charge of the country. He is remembered by some as a moderate, but he died of a heart attack after only a few years in power. His successor as president was René Schick Gutiérrez, whom most Nicaraguans viewed "as nothing more than a puppet of the Somozas". Somoza García's youngest son, Anastasio Somoza Debayle, often referred to simply as "Somoza", became president in 1967. An earthquake in 1972 destroyed nearly 90% of Managua, including much of its infrastructure. Instead of helping to rebuild the city, Somoza siphoned off relief money. The mishandling of relief money also prompted Pittsburgh Pirates star Roberto Clemente to personally fly to Managua on December 31, 1972; he died en route in an airplane accident. Even the economic elite were reluctant to support Somoza, as he had acquired monopolies in industries that were key to rebuilding the nation. The Somoza family was among a few families or groups of influential firms which reaped most of the benefits of the country's growth from the 1950s to the 1970s.
When Somoza was deposed by the Sandinistas in 1979, the family's worth was estimated to be between $500 million and $1.5 billion. Nicaraguan Revolution (1960s–1990) In 1961, Carlos Fonseca looked back to the historical figure of Sandino and, along with two other people (one of whom was believed to be Casimiro Sotelo, who was later assassinated), founded the Sandinista National Liberation Front (FSLN). After the 1972 earthquake and Somoza's apparent corruption, the ranks of the Sandinistas were flooded with young disaffected Nicaraguans who no longer had anything to lose. In December 1974, a group of FSLN members, in an attempt to kidnap U.S. ambassador Turner Shelton, held some Managuan partygoers hostage (after killing the host, former agriculture minister Jose Maria Castillo), until the Somoza government met their demands for a large ransom and free transport to Cuba. Somoza granted this, then sent his national guard out into the countryside to look for the kidnappers, whom opponents of the kidnapping described as "terrorists". On January 10, 1978, Pedro Joaquín Chamorro Cardenal, the editor of the national newspaper La Prensa and an ardent opponent of Somoza, was assassinated. It is alleged that the planners and perpetrators of the murder were at the highest echelons of the Somoza regime. The Sandinistas forcefully took power in July 1979, ousting Somoza and prompting the exodus of the majority of Nicaragua's middle class, wealthy landowners, and professionals, many of whom settled in the United States. The Carter administration decided to work with the new government, while attaching a provision for aid forfeiture if it was found to be assisting insurgencies in neighboring countries. Somoza fled the country and eventually ended up in Paraguay, where he was assassinated in September 1980, allegedly by members of the Argentinian Revolutionary Workers' Party.
In 1980, the Carter administration provided $60 million in aid to Nicaragua under the Sandinistas, but the aid was suspended when the administration obtained evidence of Nicaraguan shipment of arms to El Salvadoran rebels. In response to the coming to power of the Sandinistas, various rebel groups collectively known as the "contras" were formed to oppose the new government. The Reagan administration authorized the CIA to help the contra rebels with funding, weapons and training. The contras operated from camps in the neighboring countries of Honduras to the north and Costa Rica to the south. They engaged in a systematic campaign of terror among rural Nicaraguans to disrupt the social reform projects of the Sandinistas. Several historians have criticized the contra campaign and the Reagan administration's support for the Contras, citing the brutality and numerous human rights violations of the contras. LaRamee and Polakoff, for example, describe the destruction of health centers, schools, and cooperatives at the hands of the rebels, and others have contended that murder, rape, and torture occurred on a large scale in contra-dominated areas. The U.S. also carried out a campaign of economic sabotage, and disrupted shipping by planting underwater mines in Nicaragua's port of Corinto, an action condemned by the International Court of Justice as illegal. The court also found that the U.S. encouraged acts contrary to humanitarian law by producing the manual Psychological Operations in Guerrilla Warfare and disseminating it to the contras. The manual, among other things, advised on how to rationalize killings of civilians. The U.S. also sought to place economic pressure on the Sandinistas, and the Reagan administration imposed a full trade embargo. The Sandinistas were also accused of human rights abuses including torture, disappearances and mass executions. 
The Inter-American Commission on Human Rights investigated abuses by Sandinista forces, including an execution of 35 to 40 Miskitos in December 1981, and an execution of 75 people in November 1984. In the Nicaraguan general elections of 1984, which were judged to have been free and fair, the Sandinistas won the parliamentary election and their leader Daniel Ortega won the presidential election. The Reagan administration criticized the elections as a "sham" based on the claim that Arturo Cruz, the candidate nominated by the Coordinadora Democrática Nicaragüense, comprising three right-wing political parties, did not participate in the elections. However, the administration privately argued against Cruz's participation for fear that his involvement would legitimize the elections, and thus weaken the case for American aid to the contras. According to Martin Kriele, the results of the election were rigged. In 1983 the U.S. Congress prohibited federal funding of the contras, but the Reagan administration illegally continued to back them by covertly selling arms to Iran and channeling the proceeds to the contras (the Iran–Contra affair), for which several members of the Reagan administration were convicted of felonies. The International Court of Justice, in Nicaragua v. United States, a case brought by Nicaragua in 1984, found that "the United States of America was under an obligation to make reparation to the Republic of Nicaragua for all injury caused to Nicaragua by certain breaches of obligations under customary international law and treaty-law committed by the United States of America". During the war between the contras and the Sandinistas, 30,000 people were killed. Post-war (1990–present) In the 1990 Nicaraguan general election, a coalition of anti-Sandinista parties (from the left and right of the political spectrum) led by Violeta Chamorro, the widow of Pedro Joaquín Chamorro Cardenal, defeated the Sandinistas. The defeat shocked the Sandinistas, who had expected to win.
Exit polls of Nicaraguans reported Chamorro's victory over Ortega was achieved with a 55% majority. Chamorro was the first woman president of Nicaragua. Ortega vowed he would govern desde abajo (from below). Chamorro came to office with an economy in ruins, primarily because of the financial and social costs of the contra war with the Sandinista-led government. In the next general election, in 1996, Daniel Ortega and the Sandinistas of the FSLN lost again, this time to Arnoldo Alemán of the Constitutional Liberal Party (PLC). In the 2001 elections, the PLC again defeated the FSLN, with Alemán's vice president, Enrique Bolaños, succeeding him as president. However, Alemán was convicted and sentenced in 2003 to 20 years in prison for embezzlement, money laundering, and corruption; Liberal and Sandinista parliament members then combined to strip the presidential powers of President Bolaños and his ministers, calling for his resignation and threatening impeachment. The Sandinistas said they no longer supported Bolaños after U.S. Secretary of State Colin Powell told Bolaños to distance himself from the FSLN. This "slow motion coup d'état" was averted partially by pressure from the Central American presidents, who vowed not to recognize any movement that removed Bolaños; the U.S., the OAS, and the European Union also opposed the action. Before the general elections on November 5, 2006, the National Assembly passed a bill further restricting abortion in Nicaragua. As a result, Nicaragua is one of five countries in the world where abortion is illegal with no exceptions. Legislative and presidential elections took place on November 5, 2006. Ortega returned to the presidency with 37.99% of the vote. This percentage was enough to win the presidency outright, because of a change in electoral law which lowered the threshold for winning outright from 45% to 35% (provided the winner had a 5% margin of victory over the runner-up).
Nicaragua's 2011 general election resulted in the re-election of Ortega, with a landslide victory and 62.46% of the vote. In 2014 the National Assembly approved changes to the constitution allowing Ortega to run for a third successive term. In November 2016, Ortega was elected for his third consecutive term (his fourth overall). International monitoring of the elections was initially prohibited, and as a result the validity of the elections has been disputed, though observation by the OAS was announced in October. Nicaraguan election officials reported that Ortega received 72% of the vote. However, the Broad Front for Democracy (FAD), which had promoted boycotts of the elections, claimed that 70% of voters had abstained (while election officials claimed 65.8% participation). In April 2018, demonstrations broke out in opposition to a decree increasing taxes and reducing benefits in the country's pension system. Local independent press organizations documented at least 19 dead and over 100 missing in the ensuing conflict. A reporter from NPR spoke to protesters who explained that while the initial issue was the pension reform, the uprisings that spread across the country reflected many grievances about the government's time in office, and that the fight was for President Ortega and his wife, the vice president, to step down. April 24, 2018, saw the largest march in opposition to the Sandinista party. On May 2, 2018, university student leaders publicly gave the government seven days to set a date and time for a dialogue that had been promised to the people in the wake of the repression. The students also scheduled another peaceful march for that same day. As of May 2018, estimates of the death toll were as high as 63, many of them student protesters, and the wounded totalled more than 400.
Following a working visit from May 17 to 21, the Inter-American Commission on Human Rights adopted precautionary measures aimed at protecting members of the student movement and their families, after testimonies indicated the majority of them had suffered acts of violence and death threats for their participation. In the last week of May, thousands who accused Ortega and his wife of acting like dictators resumed anti-government rallies after attempted peace talks stalled. Geography and climate Nicaragua is slightly larger than England in land area. Nicaragua has three distinct geographical regions: the Pacific lowlands — fertile valleys which the Spanish colonists settled, the Amerrisque Mountains (north-central highlands), and the Mosquito Coast (Atlantic/Caribbean lowlands). The low plains of the Atlantic Coast are broad in places. They have long been exploited for their natural resources. On the Pacific side of Nicaragua are the two largest freshwater lakes in Central America—Lake Managua and Lake Nicaragua. Surrounding these lakes and extending to their northwest along the rift valley of the Gulf of Fonseca are fertile lowland plains, with soil highly enriched by ash from nearby volcanoes of the central highlands. Nicaragua's abundance of biologically significant and unique ecosystems contributes to Mesoamerica's designation as a biodiversity hotspot. Nicaragua has made efforts to become less dependent on fossil fuels, and it expects to acquire 90% of its energy from renewable resources by the year 2020. Nicaragua was one of the few countries that did not enter an INDC at COP21. Nicaragua initially chose not to join the Paris Climate Accord because it felt that "much more action is required" by individual countries on restricting global temperature rise. However, in October 2017, Nicaragua made the decision to join the agreement. It ratified this agreement on November 22, 2017.
Nearly one fifth of Nicaragua is designated as protected areas such as national parks, nature reserves, and biological reserves. The country had a 2019 Forest Landscape Integrity Index mean score of 3.63/10, ranking it 146th globally out of 172 countries. Geophysically, Nicaragua sits at the boundary between the Caribbean Plate, an oceanic tectonic plate underlying Central America, and the Cocos Plate. Since Central America is a major subduction zone, Nicaragua hosts most of the Central American Volcanic Arc. Pacific lowlands In the west of the country, these lowlands consist of a broad, hot, fertile plain. Punctuating this plain are several large volcanoes of the Cordillera Los Maribios mountain range, including Mombacho just outside Granada, and Momotombo near León. The lowland area runs from the Gulf of Fonseca to Nicaragua's Pacific border with Costa Rica south of Lake Nicaragua. Lake Nicaragua is the largest freshwater lake in Central America (20th largest in the world), and is home to some of the world's rare freshwater sharks (the Nicaraguan shark). The Pacific lowlands region is the most populous, with over half of the nation's population. The eruptions of western Nicaragua's 40 volcanoes, many of which are still active, have sometimes devastated settlements but also have enriched the land with layers of fertile ash. The geologic activity that produces vulcanism also breeds powerful earthquakes. Tremors occur regularly throughout the Pacific zone, and earthquakes have nearly destroyed the capital city, Managua, more than once. Most of the Pacific zone is tierra caliente, the "hot land" of tropical Spanish America found at low elevations. Temperatures there remain consistently high throughout the year. After a dry season lasting from November to April, rains begin in May and continue to October, bringing abundant precipitation to the Pacific lowlands. Good soils and a favourable climate combine to make western Nicaragua the country's economic and demographic centre.
The southwestern shore of Lake Nicaragua lies a short distance from the Pacific Ocean. Thus the lake and the San Juan River were often proposed in the 19th century as the longest part of an interoceanic canal route.
Some peasant families were forced by the National Guard to relocate into colonization projects in the rainforest. Some moved eastward into the hills, where they cleared forests in order to plant crops. Soil erosion forced them, however, to abandon their land and move deeper into the rainforest. Cattle ranchers then claimed the abandoned land. Peasants and ranchers continued this movement deep into the rainforest. By the early 1970s, Nicaragua had become the United States' top beef supplier. The beef supported fast-food chains and pet food production. President Anastasio Somoza Debayle owned the largest slaughterhouse in Nicaragua, as well as six meat-packing plants in Miami, Florida. Also in the 1950s and 1960s, 40% of all U.S. pesticide exports went to Central America. Nicaragua and its neighbors widely used compounds banned in the U.S., such as DDT, endrin, dieldrin and lindane. In 1977 a study revealed that mothers living in León had 45 times more DDT in their breast milk than the World Health Organization safe level. Sandinista insurrection (1972–1979) A major turning point was the December 1972 Managua earthquake that killed over 10,000 people and left 500,000 homeless. A great deal of international relief was sent to the nation. Some Nicaraguan historians point to the earthquake that devastated Managua as the final 'nail in the coffin' for Somoza; some 90% of the city was destroyed. Somoza's brazen corruption, mishandling of relief (which prompted Pittsburgh Pirates star Roberto Clemente to fly to Managua on December 31, 1972, to try to help - a flight that ended in his death) and refusal to rebuild Managua flooded the ranks of the Sandinistas with young disaffected Nicaraguans who no longer had anything to lose. The Sandinistas received some support from Cuba and the Soviet Union.
On December 27, 1974, a group of nine FSLN guerrillas invaded a party at the home of a former Minister of Agriculture, killing him and three guards in the process of taking several leading government officials and prominent businessmen hostage. In return for the hostages they succeeded in getting the government to pay a US$2 million ransom, broadcast an FSLN declaration on the radio and in the opposition newspaper La Prensa, release fourteen FSLN members from jail, and fly the raiders and the released FSLN members to Cuba. Archbishop Miguel Obando y Bravo acted as an intermediary during the negotiations. The incident humiliated the government and greatly enhanced the prestige of the FSLN. Somoza, in his memoirs, refers to this action as the beginning of a sharp escalation in Sandinista attacks and government reprisals. Martial law was declared in 1975, and the National Guard began to raze villages in the jungle suspected of supporting the rebels. Human rights groups condemned the actions, but U.S. President Gerald Ford refused to break the U.S. alliance with Somoza. The country tipped into full-scale civil war with the 1978 murder of Pedro Chamorro, who had opposed violence against the regime. 50,000 people turned out for his funeral. It was assumed by many that Somoza had ordered his assassination; suspected plotters included the dictator's son "El Chiguin", Somoza's president of housing Cornelio Hueck, Somoza's attorney general, and Pedro Ramos, a close Cuban ally who ran an illegal blood-plasma business. A nationwide strike, including labour and private businesses, commenced in protest, demanding an end to the dictatorship. At the same time, the Sandinistas stepped up their rate of guerrilla activity. Several towns, assisted by Sandinista guerrillas, expelled their National Guard units. Somoza responded with increasing violence and repression.
When León became the first city in Nicaragua to fall to the Sandinistas, he responded with aerial bombardment, famously ordering the air force to "bomb everything that moves until it stops moving." The U.S. media grew increasingly unfavorable in its reporting on the situation in Nicaragua. Realizing that the Somoza dictatorship was unsustainable, the Carter administration attempted to force him to leave Nicaragua. Somoza refused and sought to maintain his power through the National Guard. At that point, the U.S. ambassador sent a cable to the White House saying it would be "ill-advised" to call off the bombing, because such an action would help the Sandinistas gain power. When ABC reporter Bill Stewart was executed by the National Guard, and graphic film of the killing was broadcast on American TV, the American public became more hostile to Somoza. In the end, President Carter refused Somoza further U.S. military aid, believing that the repressive nature of the government had led to popular support for the Sandinista uprising. In May 1979, another general strike was called, and the FSLN launched a major push to take control of the country. By mid-July they had Somoza and the National Guard isolated in Managua. Sandinista period (1979–1990) As Nicaragua's government collapsed and the National Guard commanders escaped with Somoza, the U.S. first promised and then denied them exile in Miami. The rebels advanced on the capital victoriously. On July 19, 1979, a new government was proclaimed under a provisional junta headed by 33-year-old Daniel Ortega, which included Violeta Chamorro, Pedro's widow. Somoza eventually ended up in Paraguay, where he was assassinated in September 1980, allegedly by members of the Argentinian Revolutionary Workers' Party. The United Nations estimated material damage from the revolutionary war to be US$480 million. The FSLN took over a nation plagued by malnutrition, disease, and pesticide contamination.
Lake Managua was considered dead because of decades of pesticide runoff, toxic chemical pollution from lakeside factories, and untreated sewage. Soil erosion and dust storms were also a problem in Nicaragua at the time due to deforestation. To tackle these crises, the FSLN created the Nicaraguan Institute of Natural Resources and the Environment. The key large-scale programs of the Sandinistas included a National Literacy Crusade from March to August 1980. Nicaragua received international recognition for gains in literacy, health care, education, childcare, unions, and land reform. Managua became the second capital in the hemisphere, after Havana, to host a North Korean embassy. Due to tensions between their Soviet sponsors and China, the Sandinistas allowed Taiwan to retain its mission and refused to allow a Chinese mission in the country. The Sandinistas won the national election of November 4, 1984, gathering 67% of the vote. The election was certified as "free and fair" by the majority of international observers. The Nicaraguan political opposition and the Reagan administration claimed political restrictions were placed on the opposition by the government. The primary opposition candidate was the U.S.-backed Arturo Cruz, who succumbed to pressure from the United States government not to take part in the 1984 elections; later, US officials were quoted as saying, "the (Reagan) Administration never contemplated letting Cruz stay in the race, because then the Sandinistas could justifiably claim that the elections were legitimate." Three right-wing opposition parties (the Coordinadora Democrática Nicaragüense) boycotted the election, claiming that the Sandinistas were manipulating the media and that the elections might not be fair. Other opposition parties, such as the Conservative Democratic Party and the Independent Liberal Party, were free to denounce the Sandinista government and participate in the elections.
Ortega was victorious, but the long years of war had decimated Nicaragua's economy. Historian Christopher Andrew claimed that it was later discovered that the FSLN had, in fact, been suppressing right-wing opposition parties while leaving moderate parties alone, with Ortega claiming that the moderates "presented no danger and served as a convenient facade to the outside world". In 1993, the Library of Congress wrote: "Foreign observers generally reported that the election was fair. Opposition groups, however, said that the FSLN domination of government organs, mass organizations, and much of the media created a climate of intimidation that precluded a truly open election." Communist leanings and U.S.-backed Contras American support for the long rule of the Somoza family had soured relations, and the FSLN government was committed to a Marxist ideology, with many of the leading Sandinistas continuing long-standing relationships with the Soviet Union and Cuba. United States President Jimmy Carter, who had cut off aid to Somoza's Nicaragua the previous year, initially hoped that continued American aid to the new government would keep the Sandinistas from forming a doctrinaire Marxist-Leninist government aligned with the Soviet bloc. Carter administration aid was minimal, however, and the Sandinistas turned to Cuban and Eastern European assistance to build a new army of 75,000, equipped with T-55 tanks, heavy artillery and Hind attack helicopters, which made the Sandinista Army more powerful than those of its neighbors. The Soviets also pledged to provide MiG-21 fighters, but the aircraft were never delivered. With the election of Ronald Reagan in 1980, relations between the United States and the Sandinista regime became an active front in the Cold War. The Reagan administration insisted on the "Communist threat" posed by the Sandinistas, reacting particularly to the support provided to the Sandinistas by Cuba and the Soviets.
The US suspended aid due to evidence of Sandinista support for FMLN rebels in El Salvador. Prior to the aid withdrawal, FSLN politician Bayardo Arce stated that "Nicaragua is the only country building its socialism with the dollars of imperialism." The Reagan administration responded by imposing economic sanctions and a trade embargo against Nicaragua; after a brief period of sanctions, Nicaragua was faced with a collapsing economy. The U.S. trained and financed the Contras, a counter-revolutionary group based in neighboring Honduras, to militarily oppose the Sandinista government. President Reagan called the Contras "the moral equivalent of our founding fathers." The Contras, largely drawn from members of Somoza's National Guard who had fled to Honduras, were organized, trained and funded by the CIA. The Contra chain of command included some ex-National Guardsmen, among them Contra founder and commander Enrique Bermúdez, as well as ex-Sandinista hero Edén Pastora, who rejected the Leninist orientation of the Sandinistas. US support for the Contras sparked widespread criticism from many quarters around the globe, within Nicaragua and the U.S. alike, including from Democrats in Congress.
American pressure against the government escalated throughout 1983 and 1984; the Contras began a campaign of economic sabotage and disrupted shipping by planting underwater mines in Nicaragua's port of Corinto, an action later condemned by the International Court of Justice as illegal. Daniel Ortega was elected President in 1984. The years of war and Nicaragua's economic situation had taken an unparalleled toll on the country. The US government offered a political amnesty program that gave visas to any Nicaraguan without question. Nicaraguans (particularly wealthy ones or those with familial connections in the US) left the country in the largest emigration in Nicaraguan history. On May 1, 1985, Reagan issued an executive order that imposed a full economic embargo on Nicaragua, which remained in force until March 1990. Nicaragua won a historic case against the U.S. at the International Court of Justice in 1986 (see Nicaragua v. United States), and the U.S. was ordered to pay Nicaragua $12 billion in reparations for violating Nicaraguan sovereignty by engaging in attacks against it. The United States withdrew its acceptance of the Court, arguing it had no authority in matters of sovereign state relations. The United Nations General Assembly passed a resolution to pressure the U.S. to pay. Only Israel and El Salvador, which was backed in its own guerrilla insurgency, voted with the U.S. Jeane Kirkpatrick, the American ambassador to the UN, criticized the Court as a "semi-judicial" body. In addition, the U.S.
noted that Cuba and the Soviet Union had earlier committed the same violations against Nicaraguan sovereignty by providing training and ammunition to the Sandinistas against the Somoza regime. The International Court of Justice decision characterized the conflict in Nicaragua as one of aggression directed by a foreign power against Nicaragua. In a twelve to three vote, the Court's summary judgment against the United States stated that by: ...training, arming, equipping, financing and supplying the contra forces or otherwise encouraging, supporting and aiding military and paramilitary activities in and against Nicaragua, the United States has acted, against the Republic of Nicaragua, in breach of its obligation under customary international law not to intervene in the affairs of another State. In 1982, legislation was enacted by the US Congress to prohibit further aid to the Contras. Reagan's officials attempted to illegally supply them out of the proceeds of arms sales to Iran and third-party donations, triggering the Iran–Contra Affair of 1986–87. Mutual exhaustion, Sandinista fears of Contra unity and military success, and mediation by other regional governments led to the Sapoa ceasefire between the Sandinistas and the Contras on March 23, 1988. Subsequent agreements were designed to reintegrate the Contras and their supporters into Nicaraguan society in preparation for general elections. Sixteen years of center-right rule (1990–2006) The FSLN lost to the National Opposition Union by 14 points in elections on February 25, 1990. ABC News had predicted a 16-point Sandinista victory. At the beginning of Violeta Chamorro's nearly 7 years in office, the Sandinistas still largely controlled the army, labor unions, and courts. Her government made moves towards consolidating democratic institutions, advancing national reconciliation, stabilizing the economy, and privatizing state-owned enterprises. In February 1995, Sandinista Popular Army Cmdr. Gen.
Humberto Ortega was replaced, in accordance with a new military code enacted in 1994, by Gen. Joaquín Cuadra, who espoused a policy of greater professionalism in the renamed Army of Nicaragua. A new police organization law, passed by the National Assembly and signed into law in August 1996, further codified both civilian control of the police and the professionalization of that law enforcement agency. The October 20, 1996 presidential, legislative, and mayoral elections were also judged free and fair by international observers and by the groundbreaking national electoral observer group Ética y Transparencia (Ethics and Transparency), despite a number of irregularities due largely to logistical difficulties and a baroquely complicated electoral law. This time Nicaraguans elected former Managua Mayor Arnoldo Alemán, leader of the center-right Liberal Alliance, which later consolidated into the Constitutional Liberal Party (PLC).

suspected as a Sandinista sympathizer, gave many ordinary Nicaraguans the idea that the Sandinistas were much stronger than was the case. Somoza acquired monopolies in industries that were key to rebuilding the nation, not allowing other members of the upper class to share the profits that would result from the reborn economic activity. This ultimately weakened Somoza, since even the economic elite were reluctant to support him. In the 1950s a synthetic substitute for cotton, one of Nicaragua's economic pillars of the epoch, was developed. This caused the price of cotton to decrease, placing the economy in great trouble. Landless peasants worked on large plantations during short harvest seasons and received wages as low as US$1 per day. In desperation, many of these poor laborers migrated east, seeking their own land near the rain forest. In 1968, the World Health Organization found that polluted water led to 17% of all Nicaraguan deaths.
American economic involvement
From 1945 to 1960, the U.S.-owned Nicaraguan Long Leaf Pine Company (NIPCO) directly paid the Somoza family millions of dollars in exchange for favorable benefits to the company, such as not having to reforest clear-cut areas. By 1961, NIPCO had cut all of the commercially viable coastal pines in northeast Nicaragua. Expansion of cotton plantations in the 1950s and cattle ranches in the 1960s forced peasant families from the areas they had farmed for decades. Some were forced by the National Guard to relocate into colonization projects in the rainforest. Some moved eastward into the hills, where they cleared forests in order to plant crops. Soil erosion forced them, however, to abandon their land and move deeper into the rainforest. Cattle ranchers then claimed the abandoned land. Peasants and ranchers continued this movement deep into the rain forest. By the early 1970s, Nicaragua had become the United States' top beef supplier. The beef supported fast-food chains and pet food production. President Anastasio Somoza Debayle owned the largest slaughterhouse in Nicaragua, as well as six meat-packing plants in Miami, Florida. Also in the 1950s and 1960s, 40% of all U.S. pesticide exports went to Central America. Nicaragua and its neighbors widely used compounds banned in the U.S., such as DDT, endrin, dieldrin and lindane. In 1977 a study revealed that mothers living in León had 45 times more DDT in their breast milk than the World Health Organization safe level.

Sandinista insurrection (1972–1979)
A major turning point was the December 1972 Managua earthquake that killed over 10,000 people and left 500,000 homeless. A great deal of international relief was sent to the nation. Some Nicaraguan historians point to the earthquake that devastated Managua as the final 'nail in the coffin' for Somoza; some 90% of the city was destroyed.
Somoza's brazen corruption, mishandling of relief (which prompted Pittsburgh Pirates star Roberto Clemente to fly to Managua on December 31, 1972, to try to help, a flight that ended in his death) and refusal to rebuild Managua flooded the ranks of the Sandinistas with young disaffected Nicaraguans who no longer had anything to lose. The Sandinistas received some support from Cuba and the Soviet Union. On 27 December 1974, a group of nine FSLN guerrillas invaded a party at the home of a former Minister of Agriculture, killing him and three guards in the process of taking several leading government officials and prominent businessmen hostage. In return for the hostages they succeeded in getting the government to pay a US$2 million ransom, broadcast an FSLN declaration on the radio and in the opposition newspaper La Prensa, release fourteen FSLN members from jail, and fly the raiders and the released FSLN members to Cuba. Archbishop Miguel Obando y Bravo acted as an intermediary during the negotiations. The incident humiliated the government and greatly enhanced the prestige of the FSLN. Somoza, in his memoirs, refers to this action as the beginning of a sharp escalation in Sandinista attacks and government reprisals. Martial law was declared in 1975, and the National Guard began to raze villages in the jungle suspected of supporting the rebels. Human rights groups condemned the actions, but U.S. President Gerald Ford refused to break the U.S. alliance with Somoza. The country tipped into full-scale civil war with the 1978 murder of Pedro Chamorro, who had opposed violence against the regime; 50,000 turned out for his funeral. It was assumed by many that Somoza had ordered his assassination; suspected plotters included the dictator's son, "El Chiguin"; Somoza's President of Housing, Cornelio Hueck; Somoza's Attorney General; and Pedro Ramos, a close Cuban ally who dealt in illegal blood plasma.
A nationwide strike, including labour and private businesses, commenced in protest, demanding an end to the dictatorship. At the same time, the Sandinistas stepped up their rate of guerrilla activity. Several towns, assisted by Sandinista guerrillas, expelled their National Guard units. Somoza responded with increasing violence and repression. When León became the first city in Nicaragua to fall to the Sandinistas, he responded with aerial bombardment, famously ordering the air force to "bomb everything that moves until it stops moving." The U.S. media grew increasingly unfavorable in its reporting on the situation in Nicaragua. Realizing that the Somoza dictatorship was unsustainable, the Carter administration attempted to force him to leave Nicaragua. Somoza refused and sought to maintain his power through the National Guard. At that point, the U.S. ambassador sent a cable to the White House saying it would be "ill-advised" to call off the bombing, because such an action would help the Sandinistas gain power. When ABC reporter Bill Stewart was executed by the National Guard, and graphic film of the killing was broadcast on American TV, the American public became more hostile to Somoza. In the end, President Carter refused Somoza further U.S. military aid, believing that the repressive nature of the government had led to popular support for the Sandinista uprising. In May 1979, another general strike was called, and the FSLN launched a major push to take control of the country. By mid-July they had Somoza and the National Guard isolated in Managua.

Sandinista period (1979–1990)
As Nicaragua's government collapsed and the National Guard commanders escaped with Somoza, the U.S. first promised and then denied them exile in Miami. The rebels advanced on the capital victoriously. On July 19, 1979, a new government was proclaimed under a provisional junta headed by 33-year-old Daniel Ortega, and included Violeta Chamorro, Pedro's widow.
Somoza eventually ended up in Paraguay, where he was assassinated in September 1980, allegedly by members of the Argentinian Revolutionary Workers' Party. The United Nations estimated material damage from the revolutionary war to be US$480 million. The FSLN took over a nation plagued by malnutrition, disease, and pesticide contamination. Lake Managua was considered dead because of decades of pesticide runoff, toxic chemical pollution from lakeside factories, and untreated sewage. Soil erosion and dust storms were also a problem in Nicaragua at the time due to deforestation. To tackle these crises, the FSLN created the Nicaraguan Institute of Natural Resources and the Environment. The key large-scale programs of the Sandinistas included a National Literacy Crusade from March to August 1980. Nicaragua received international recognition for gains in literacy, health care, education, childcare, unions, and land reform. Managua became the second capital in the hemisphere, after Havana, to host an embassy from North Korea. Due to tensions between their Soviet sponsors and China, the Sandinistas allowed Taiwan to retain its mission and refused to allow a Chinese mission in the country. The Sandinistas won the national election of November 4, 1984, gathering 67% of the vote. The election was certified as "free and fair" by the majority of international observers. The Nicaraguan political opposition and the Reagan administration claimed political restrictions were placed on the opposition by the government. The primary opposition candidate was the U.S.-backed Arturo Cruz, who succumbed to pressure from the United States government not to take part in the 1984 elections; later US officials were quoted as saying, "the (Reagan) Administration never contemplated letting Cruz stay in the race, because then the Sandinistas could justifiably claim that the elections were legitimate."
Three right-wing opposition parties (Coordinadora Democrática Nicaragüense) boycotted the election, claiming that the Sandinistas were manipulating the media and that the elections might not be fair. Other opposition parties, such as the Conservative Democratic Party and the Independent Liberal Party, were free to denounce the Sandinista government and participate in the elections. Ortega was victorious, but the long years of war had decimated Nicaragua's economy. Historian Christopher Andrew claimed that it was later discovered that the FSLN had, in fact, been suppressing right-wing opposition parties while leaving moderate parties alone, with Ortega claiming that the moderates "presented no danger and served as a convenient facade to the outside world". In 1993, the Library of Congress wrote "Foreign observers generally reported that the election was fair. Opposition groups, however, said that the FSLN domination of government organs, mass organizations, and much of the media created a climate of intimidation that precluded a truly open election."

Communist leanings and U.S.-backed Contras
American support for the long rule of the Somoza family had soured relations, and the FSLN government was committed to a Marxist ideology, with many of the leading Sandinistas maintaining long-standing relationships with the Soviet Union and Cuba. United States President Jimmy Carter, who had cut off aid to Somoza's Nicaragua the previous year, initially hoped that continued American aid to the new government would keep the Sandinistas from forming a doctrinaire Marxist-Leninist government aligned with the Soviet bloc. But the Carter administration's aid was minimal, and the Sandinistas turned to Cuban and Eastern European assistance to build a new army of 75,000, equipped with T-55 tanks, heavy artillery and HIND attack helicopters, which made the Sandinista Army more powerful than those of its neighbors.
The Soviets also pledged to provide MiG-21 fighters, but the aircraft were never delivered. With the election of Ronald Reagan in 1980, relations between the United States and the Sandinista regime became an active front in the Cold War. The Reagan administration insisted on the "Communist threat" posed by the Sandinistas, reacting particularly to the support provided to the Sandinistas by Cuba and the Soviets. The US suspended aid due to evidence of Sandinista support for FMLN rebels in El Salvador. Prior to the U.S. aid withdrawal, FSLN politician Bayardo Arce stated that "Nicaragua is the only country building its socialism with the dollars of imperialism." The Reagan administration responded by imposing economic sanctions and a trade embargo against Nicaragua. After a brief period of sanctions, Nicaragua was faced with a collapsing economy. The U.S. trained and financed the Contras, a counter-revolutionary group based in neighboring Honduras, to militarily oppose the Sandinista government. President Reagan called the Contras "the moral equivalent of our founding fathers." The Contras, groups of Somoza's National Guard who had fled to Honduras, were organized, trained and funded by the CIA. The Contra chain of command included some ex-National Guardsmen, including Contra founder and commander Enrique Bermúdez, and others, including ex-Sandinista hero Edén Pastora, who rejected the Leninist orientation of the Sandinistas. The Contras operated out of camps in neighboring Honduras to the north and Costa Rica to the south. They engaged in a systematic campaign of terror amongst the rural Nicaraguan population to disrupt the social reform projects of the Sandinistas. US support for the Contras sparked widespread criticism from many quarters around the globe, including within Nicaragua and the U.S., among them Democrats in Congress.
Several historians have criticized the contra campaign and the Reagan Administration's support for it, citing the brutality and numerous human rights violations of the Contras. LaRamee and Polakoff, for example, describe the destruction of health centers, schools and cooperatives at the hands of the rebels. Others have contended that large scale murder, rape and torture also occurred in Contra dominated areas. The US also sought to place economic pressure on the Sandinistas, and the Reagan administration imposed a full trade embargo.
passing southeast across the isthmus from the Golfo de Fonseca to the Río San Juan. The rift is occupied in part by the largest freshwater lakes in Central America: Lago de Managua (56 kilometers long and 24 kilometers wide) and Lago de Nicaragua (about 160 kilometers long and 75 kilometers wide). These two lakes are joined by the Río Tipitapa, which flows south into Lago de Nicaragua. Lago de Nicaragua in turn drains into the Río San Juan (the boundary between Nicaragua and Costa Rica), which flows through the southern part of the rift lowlands to the Caribbean Sea. The valley of the Río San Juan forms a natural passageway close to sea level across the Nicaraguan isthmus from the Caribbean Sea to Lago de Nicaragua and the rift. From the southwest edge of Lago de Nicaragua, it is only nineteen kilometers to the Pacific Ocean. This route was considered a possible alternative to the Panama Canal at various times in the past. Surrounding the lakes and extending northwest of them along the rift valley to the Golfo de Fonseca are fertile lowland plains highly enriched with volcanic ash from nearby volcanoes. These lowlands are densely populated and well cultivated. More directly west of the lake region is a narrow line of ash-covered hills and volcanoes that separate the lakes from the Pacific Ocean. This line is highest in the central portion near the cities of León and Managua. Because Western Nicaragua is located where two major tectonic plates collide, it is subject to earthquakes and volcanic eruptions. Although periodic volcanic eruptions have caused agricultural damage from fumes and ash, earthquakes have been by far more destructive to life and property. Hundreds of shocks occur each year, some of which cause severe damage. The capital city of Managua was virtually destroyed in 1931 and again in 1972.

Central highlands
The triangular area known as the central highlands lies northeast and east of the Pacific lowlands.
These rugged mountains are composed of ridges 900 to 1,800 meters high and a mixed forest of oak and pine alternating with deep valleys that drain primarily toward the Caribbean. Very few significant streams flow west to the Pacific Ocean. Those that do are steep, short, and flow intermittently. The relatively dry western slopes of the central highlands, protected by the ridges of the highlands from the moist winds of the Caribbean, have drawn farmers from the Pacific region since colonial times. The eastern slopes of the highlands are covered with rain forests and are lightly populated with pioneer agriculturalists and small communities of indigenous people.

Caribbean lowland
The eastern Caribbean lowlands of Nicaragua form an extensive (occupying more than 50 percent of national territory) and still sparsely settled hot, humid area that includes coastal plains, the eastern spurs of the central highlands, and the lower portion of the Río San Juan basin. The soil is generally leached and infertile. Pine and palm savannas predominate as far south as the Laguna de Perlas. Tropical rain forests are characteristic from the Laguna de Perlas to the Río San Juan, in the interior west of the savannas, and along rivers through the savannas. Fertile soils are found only along the natural levees and narrow floodplains of the numerous rivers, including the Escondido, the Río Grande de Matagalpa, the Prinzapolka, and the Coco, and along the many lesser streams that rise in the central highlands and cross the region en route to the complex of shallow bays, lagoons, and salt marshes of the Caribbean coast.

Climate
Temperature varies little with the seasons in Nicaragua and is largely a function of elevation. The tierra caliente, or "hot land", is characteristic of the foothills and lowlands from sea level to about of elevation. Here, daytime temperatures average , and night temperatures drop to most of the year.
The tierra templada, or "temperate land", is characteristic of most of the central highlands, where elevations range between . Here, daytime temperatures are mild, and nights are cool. The tierra fría, or "cold land", at elevations above , is found only on and near the highest peaks of the central highlands. Daytime averages in this region are , with nighttime lows below .

Rainfall
Rainfall varies greatly in Nicaragua. The Caribbean lowlands are the wettest section of Central America, receiving between of rain annually. The western slopes of the central highlands and the Pacific lowlands receive considerably less annual rainfall, being protected from moisture-laden Caribbean trade winds by the peaks of the central highlands. Mean annual precipitation for the rift valley and western slopes of the highlands ranges from . Rainfall is seasonal: May through October is the rainy season, and December through April is the driest period. During the rainy season, Eastern Nicaragua is subject to heavy flooding along the upper and middle reaches of all major rivers. Near the coast, where river courses widen and river banks and natural levees are low, floodwaters spill over onto the floodplains until large sections of the lowlands become continuous sheets of water. River bank agricultural plots are often heavily damaged, and considerable numbers of savanna animals die during these floods. The coast is also subject to destructive tropical storms and hurricanes, particularly from July through October. The high winds and floods accompanying these storms often cause considerable destruction of property. In addition, heavy rains (called papagayo storms) accompanying the passage of a cold front or a low-pressure area may sweep from the north through both eastern and western Nicaragua (particularly the rift valley) from November through March.
Hurricanes or heavy rains in the central highlands, where agriculture has destroyed much of the natural vegetation, also cause considerable crop damage and soil erosion. In 1988, Hurricane Joan forced hundreds of thousands of Nicaraguans to flee their homes and caused more than US$1 billion in damage, most of it along the Caribbean coast. In November 2020, two major hurricanes, Eta and Iota, made landfall in nearly the same location in consecutive weeks, causing hundreds of deaths throughout the Caribbean region and millions of dollars in damage.

Environment
Nicaragua is subject to destructive earthquakes, volcanoes, landslides, and occasionally severe hurricanes. It currently faces deforestation, soil erosion, and water pollution. It is a party to the United Nations Framework Convention on Climate Change, the Kyoto Protocol, the Nuclear Test Ban Treaty, and the Ozone Layer Protection agreement, and has signed but not ratified the Law of the Sea.

Extreme points
Northernmost point: North of Liwa Sirpe
Southernmost point: Trinidad, Río San Juan
Westernmost point: Pacific coast at Gulf of Fonseca, Chinandega Department
Easternmost point: Miskito Cays archipelago, North Caribbean Coast
Population distribution
Ninety percent of Nicaraguans live in the Pacific lowlands and the adjacent interior highlands. The population is 54% urban. The most populous city in Nicaragua is the capital city, Managua, with a population of 1.2 million (2005). As of 2005, over 4.4 million inhabitants live in the Pacific, Central and North regions of the country. There are 2.7 million residents in the Pacific region. The Caribbean region has an estimated 700,000 residents. In addition, many Nicaraguans live abroad.

Departments by population

Vital statistics
Registration of vital events in Nicaragua is not complete. The Population Department of the United Nations prepared the following estimates.

Fertility and births
Total Fertility Rate (TFR) (Wanted Fertility Rate) and Crude Birth Rate (CBR):

Births and deaths

Ethnic groups
In the 19th century, there had been a substantial indigenous minority, but this group was largely assimilated culturally into the mestizo majority. Primarily in the 19th century, Nicaragua saw several waves of immigration from other European nations. In particular, the northern cities of Estelí, Jinotega and Matagalpa have significant fourth-generation German populations. Most of Nicaragua's population lives in the western region of the country in the departments of Managua, Granada and León. According to the 2005 census, 443,847 residents (8.6%) consider themselves to belong to an indigenous people or to an ethnic community. The remaining majority of the Nicaraguan population (91.6%) are deemed mestizo and white, with the majority of these being of Spanish descent, along with some German, Italian, Portuguese and French ancestry. Mestizos and whites mainly reside in the western region of the country. Part of the black or Afro-Nicaraguan population, which mainly resides on the country's sparsely populated Caribbean (or Atlantic) coast, may also be included in the majority population that does not consider itself to belong to an ethnic community.
In the 2005 census, there were only 19,890 Creoles (0.4% of the total population). The Creole population is mostly of West Indian (Antillean) origin, the descendants of indentured laborers brought mostly from Jamaica when the region was a British protectorate. The Garifuna, a people of mixed Carib, Angolan, Congolese and Arawak descent, numbered 3,271 in 2005 (0.1%). 112,253 people considered themselves "Mestizo de la Costa Caribe" (mestizo of the Caribbean coast). In addition to the inhabitants who declared themselves Indigenous or members of an ethnic community, 13,740 answered "Other", 47,473 responded "Not Sure", and an additional 19,460 responded "Ignore".

Indigenous population
The Native American population, the unmixed descendants of the country's indigenous inhabitants, numbered 227,760 (4.4% of the total population) in 2005. Nicaragua's pre-Columbian population consisted of many indigenous groups. In the western region, the Nahua people (also known as the Pipil-Nicaraos) were present along with other groups such as the Chorotega people. The central region and the Caribbean coast of Nicaragua were inhabited by indigenous peoples who were mostly Chibcha-related groups that had migrated from South America, primarily present-day Colombia and Venezuela. These groups include the Miskitos (120,817 people), Matagalpa (15,240 people), Ramas (4,185 people), Sumos (9,756 people) and Ulwa (698 people). Other indigenous peoples include the Subtiaba (19,949 people) and the modern-day Chorotegas, who are also known as the Mangue (46,002 people). In the 19th century, there was still a substantial indigenous minority, but this group was largely assimilated culturally into the mestizo majority. In the mid-1980s, the government divided the department of Zelaya (consisting of the eastern half of the country) into two autonomous regions and granted the black and indigenous people of this region limited self-rule within the Republic.
Immigration
Relative to its overall population, Nicaragua has never experienced any large-scale wave of immigrants. The total number of immigrants to Nicaragua, both from other Latin American countries and from all other countries, never surpassed 1% of its total population prior to 1995. The 2005 census showed the foreign-born population at 1.2%, having risen 0.06% in 10 years. However, in the 19th century, Nicaragua received immigrants from Europe, who established many agricultural businesses such as coffee and sugar cane plantations, as well as newspapers, hotels and banks.

Emigration
During the Nicaraguan Revolution and the Civil War, thousands of Nicaraguans left the country. After the 1990 Nicaraguan elections some people returned, but many more emigrated during the rest of the decade. In 1998, Hurricane Mitch killed almost 4,000 people in the country and destroyed much of the Nicaraguan economy; as a result, thousands of Nicaraguans received Temporary Protected Status (TPS), allowing them to emigrate to the United States as "refugees". In recent years, many Nicaraguans have left the country to escape poverty and unemployment. Nicaraguan emigration is a recent process. During the 1990–2004 period, more than 800,000 Nicaraguans left the country, compared to 100,000 during the 1970–1989 period. According to the World Bank, in 2005 there were 683,520 Nicaraguans living outside Nicaragua legally. If those who are undocumented are counted, some sources estimate as many as 1,500,000 Nicaraguans living abroad by the end of 2005. Nicaraguans are the third largest community of Central Americans living abroad, after Guatemalans and Salvadorans. Nicaragua is also the second country in Central America by percentage of population living abroad. Remittances to Nicaragua represent about 15% of the country's GDP.
In 2008 Nicaragua received close to one billion dollars in remittances; an increase from the $750,000,000 received in 2007, according to the World Bank Language The official language of Nicaragua | population (01.07.2009) (estimates): Population distribution Ninety percent of Nicaraguans live in the Pacific lowlands and the adjacent interior highlands. The population is 54% urban. The most populous city in Nicaragua is the capital city, Managua, with a population of 1.2 million (2005). As of 2005, over 4.4 million inhabitants live in the Pacific, Central and North regions of the country. There are 2.7 million residents in the Pacific region. The Caribbean region has an estimated 700,000 residents. In addition, many Nicaraguans live abroad. Departments by population Vital statistics Registration of vital events is in Nicaragua not complete. The Population Department of the United Nations prepared the following estimates. Fertility and births Total Fertility Rate (TFR) (Wanted Fertility Rate) and Crude Birth Rate (CBR): Births and deaths Ethnic groups In the 19th century, there had been a substantial indigenous minority, but this group was also largely assimilated culturally into the mestizo majority. Primarily in the 19th century, Nicaragua saw several waves of immigration from other European nations. In particular the northern cities of Estelí, Jinotega and Matagalpa have significant fourth generation Germans. Most of Nicaragua's population lives in the western region of the country in the departments of Managua, Granada and León. According to the 2005 census 443,847 (8.6%) residents consider themselves to belong to an indigenous people or to an ethnic community. The remaining majority of the Nicaraguan population (91.6%) are deemed mestizo and white, with the majority of these being of Spanish, with some German, Italian, Portuguese and French ancestry. Mestizos and whites mainly reside in the western region of the country. 
Part of the black or Afro-Nicaraguan population, which mainly resides on the country's sparsely populated Caribbean (or Atlantic) coast, may also be included in the majority population that does not consider itself to belong to an ethnic community. In the 2005 census, there were only 19,890 Creoles (0.4% of the total population). The Creole population is mostly of West Indian (Antillean) origin, the descendants of indentured laborers brought mostly from Jamaica when the region was a British protectorate. The Garifuna, a people of mixed Carib, Angolan, Congolese and Arawak descent, numbered 3,271 in 2005 (0.1%). 112,253 people considered themselves "Mestizo de la Costa Caribe" (mestizo of the Caribbean coast). In addition to the inhabitants who declared themselves indigenous or members of an ethnic community, 13,740 answered "Other", 47,473 responded "Not Sure", and an additional 19,460 responded "Ignore". Indigenous population The Native American population, the unmixed descendants of the country's indigenous inhabitants, numbered 227,760 (4.4% of the total population) in 2005. Nicaragua's pre-Columbian population consisted of many indigenous groups. In the western region, the Nahua people (also known as the Pipil-Nicaraos) were present along with other groups such as the Chorotega people. The central region and the Caribbean coast of Nicaragua were inhabited by indigenous peoples who were mostly Chibcha-related groups that had migrated from South America, primarily present-day Colombia and Venezuela. These groups include the Miskitos (120,817 people), Matagalpa (15,240 people), Ramas (4,185 people), Sumos (9,756 people) and Ulwa (698 people). Other indigenous peoples include the Subtiaba (19,949 people) and the modern-day Chorotegas, who are also known as the Mangue (46,002 people).
In the mid-1980s, the government divided the department of Zelaya, consisting of the eastern half of the country, into two autonomous regions and granted the black and indigenous people of this region limited self-rule within the Republic.
runner-up in the presidential race, for a total of 92. In the 2011 elections, the Sandinista National Liberation Front won 63 seats (securing a majority), the Independent Liberal Party won 27 seats, and the Constitutionalist Liberal Party won 2 seats. This includes seats given to outgoing Vice President Jaime Morales Carazo and presidential runner-up Fabio Gadea Mantilla. Outgoing Vice President Jaime Morales Carazo's seat would usually be given to the outgoing president; however, Daniel Ortega was re-elected after the Constitution was modified to remove term limits. Political parties and elections Judicial branch The Supreme Court of Justice supervises the functioning of the still largely ineffective and overburdened judicial system. As part of the 1995 constitutional reforms, the independence of the Supreme Court was strengthened by increasing the number of magistrates from 9 to 12. In 2000, the number of Supreme Court justices was increased to 16. Supreme Court justices are nominated by the political parties and elected to 5-year terms by the National Assembly. Electoral branch Led by a council of seven magistrates, the Supreme Electoral Council (CSE) is a co-equal branch of government responsible for organizing and conducting elections, plebiscites, and referendums. The magistrates and their alternates are elected to 5-year terms by the National Assembly. Constitutional changes in 2000 expanded the number of CSE magistrates from five to seven and gave the PLC and the FSLN a freer hand to name party activists to the council, prompting allegations that both parties were politicizing electoral institutions and processes and excluding smaller political parties. Human rights Freedom of speech is a right guaranteed by the Nicaraguan constitution, but the media have come under censorship from time to time.
Other constitutional freedoms include peaceful assembly and association, freedom of religion, and freedom of movement within the country, as well as foreign travel, emigration, and repatriation. The government also permits domestic and international human rights monitors to operate freely in Nicaragua. The constitution prohibits discrimination based on birth, nationality, political belief, race, gender, language, religion, opinion, national origin, and economic or social condition. Homosexuality has been legal since 2008. Segovia, Rivas, Río San Juan, as well as in two autonomous regions: North Caribbean Coast Autonomous Region and South Caribbean Coast Autonomous Region. Foreign relations Nicaraguan President Daniel Ortega said on March 6, 2008 that the nation was breaking relations with Colombia "in solidarity with the Ecuadoran people", following the 2008 Andean diplomatic crisis. Relations were restored soon after. Political pressure groups Some political pressure groups are:
National Workers Front or FNT, a Sandinista umbrella group of eight labor unions, including:
Farm Workers Association or ATC
Health Workers Federation or FETSALUD
Heroes and Martyrs Confederation of Professional Associations or CONAPRO
National Association of Educators of Nicaragua or ANDEN
National Union of Employees or UNE
National Union of Farmers and Ranchers or UNAG
Sandinista Workers' Centre or CST
Union of Journalists of Nicaragua or UPN
Permanent Congress of Workers or CPT, an umbrella group of four non-Sandinista labor unions, including:
Autonomous Nicaraguan Workers Central or CTN-A
Confederation of Labour Unification or CUS
Independent General Confederation of Labor or CGT-I
Labor Action and Unity Central or CAUS
Nicaraguan Workers' Central or CTN, an independent labor union
Superior Council of Private Enterprise or COSEP, a confederation of business groups
See also 2013–2019 Nicaraguan protests References External links National Assembly of Nicaragua Presidency of Nicaragua
Supreme Court of Nicaragua Government of Nicaragua South Caribbean Coast Autonomous Region North Caribbean Coast
a high debt-service burden, leaving it highly dependent on foreign assistance, which represented almost 25% of GDP in 2001. One of the key engines of economic growth has been production for export. Although traditional products such as coffee, meat, and sugar continue to lead the list of Nicaraguan exports, the fastest growth is now in nontraditional exports: textiles and apparel; gold; seafood; and new agricultural products such as peanuts, sesame, melons, and onions. In 2007, exports topped US$1 billion for the first time in Nicaraguan history. Nicaragua is primarily an agricultural country, but construction, mining, fisheries, and general commerce have also been expanding during the last few years. Foreign private capital inflows topped $300 million in 1999 but, due to economic and political uncertainty, fell to less than $100 million in 2001. In the last 12 years, tourism has grown 394%; this rapid growth has made it Nicaragua's second largest source of foreign capital. Less than three years ago, the nation's tourism budget was US$400,000; today, it is over $2 million. Nicaragua's economy has also produced a construction boom, most of it in and around Managua. Nicaragua faces a number of challenges in stimulating rapid economic growth. An International Monetary Fund (IMF) program is currently being followed, with the aim of attracting investment, creating jobs, and reducing poverty by opening the economy to foreign trade. This process was boosted in late 2000 when Nicaragua reached the decision point under the Heavily Indebted Poor Countries (HIPC) debt relief initiative. However, HIPC benefits were delayed because Nicaragua subsequently fell "off track" from its IMF program. The country has also been grappling with a string of bank failures that began in August 2000. Moreover, Nicaragua continues to lose international reserves due to its growing fiscal deficits.
The country is still a recovering economy and it continues to implement further reforms, on which aid from the IMF is conditional. In 2005, finance ministers of the leading eight industrialized nations (G8) agreed to forgive some of Nicaragua's foreign debt as part of the HIPC program. According to the World Bank, Nicaragua's GDP was around US$4.9 billion. In March 2007, Poland and Nicaragua signed an agreement to write off $30.6 million which had been borrowed by the Nicaraguan government in the 1980s. The U.S. is the country's largest trading partner, providing 25% of Nicaragua's imports and receiving about 60% of its exports. About 25 wholly or partly owned subsidiaries of U.S. companies operate in Nicaragua. The largest of those operations are in the energy, communications, manufacturing, fisheries, and shrimp farming sectors. Opportunities exist for expanded foreign investment in those sectors, as well as in tourism, mining, franchising, and the distribution of imported consumer, manufacturing, and agricultural goods.
There also are copper mines in northeastern Nicaragua. Gross Domestic Product (GDP) in purchasing power parity (PPP) in 2012 was estimated at US$20.04 billion, and GDP per capita in PPP at US$3,300, making Nicaragua the second poorest country in the Western Hemisphere. The service sector is the largest component of GDP at 56.7%, followed by the industrial sector at 25.8% (2012). Agriculture represents 17.5% of GDP, the largest share of any Central American country. The Nicaraguan labor force is estimated at 2.961 million, of which 28% works in agriculture, 19% in industry and 53% in services (2012). Agriculture and food production Coffee became Nicaragua's principal crop in the 1870s, a position it still held in 1992 despite the growing importance of other crops. Cotton gained importance in the late 1940s and in 1992 was the second biggest export earner. In the early 20th century, Nicaraguan governments were reluctant to give concessions to the large United States banana companies, and bananas never attained the prominence in Nicaragua that they reached in Nicaragua's Central American neighbors; bananas were grown in the country, however, and were generally the third largest export earner in the post-World War II period. Beef and animal byproducts, the most important agricultural exports for the three centuries before the coffee boom of the late 19th century, were still important commodities in 1992. From the end of World War II to the early 1960s, the growth and diversification of the agricultural sector drove the nation's economic expansion. From the early 1960s until the increased fighting in 1977 caused by the Sandinista revolution, agriculture remained a robust and significant part of the economy, although its growth slowed somewhat in comparison with the previous postwar decades. Statistics for the next fifteen years, however, show stagnation and then a drop in agricultural production.
The agricultural sector declined precipitously in the 1980s. Until the late 1970s, Nicaragua's agricultural export system generated 40 percent of the country's GDP, 60 percent of national employment, and 80 percent of foreign exchange earnings. Throughout the 1980s, the Contras destroyed or disrupted coffee harvests as well as other key income-generating crops. Private industry stopped investing in agriculture because of uncertain returns. Land was taken out of production of export crops to expand plantings of basic grains. Many coffee plants succumbed to disease. In 1989, the fifth successive year of decline, farm production fell by roughly 7 percent in comparison with the previous year. Production of basic grains fell as a result of Hurricane Joan in 1988 and a drought in 1989. By 1990 agricultural exports had declined to less than half the level of 1978. The only bright spot was the production of nontraditional export crops such as sesame, tobacco, and African palm oil. Services The service sector was estimated to account for 56.8% of the country's GDP and employs 52% of the active population. This sector includes transportation, commerce, warehousing, restaurants and hotels, arts and entertainment, health, education, financial and banking services, and telecommunications, as well as public administration and defense. Tourism in Nicaragua is one of the most important industries in the country. It is the second largest source of foreign exchange for the country and was predicted to become the largest industry in 2007. The growth in tourism has positively affected the agricultural, commercial, finance, and construction industries as well. Current economic outlook Nicaragua has ratified free trade agreements with major markets such as the United States, the Dominican Republic (DR-CAFTA), Taiwan and Mexico, among others.
As evidence of continuous efforts to improve the business climate, Nicaragua has been ranked favorably in a variety of independent evaluations. The 2011 Doing Business Report, published by the World Bank Group, which benchmarks various indicators of the investment climate in 183 nations, ranked Nicaragua as the top location in Central America for starting a business, investor protection, and closing a business. Additionally, the country improved in the following categories: ease of doing business, registering property, paying taxes, trading across borders and enforcing contracts. Data The following table shows the main economic indicators in 1980–2020 (with IMF staff estimates in 2021–2026). Inflation below 5% is in green. Other statistics
Household income or consumption by percentage share: lowest 10%: 1.4%; highest 10%: 41.8% (2005)
Industrial production growth rate: 2.4% (2005)
Electricity - production: 2.778 billion kWh (2006)
Electricity - production by source: fossil fuel: 53.43%; hydro: 35.34%; nuclear: 0%; other: 11.23% (1998). A large number of wind turbines have been installed along the southwestern shore of Lake Nicaragua since then, and some geothermal plants have been constructed as well. As of 2013, the breakdown was: fossil fuel: 50%; wind power: 15%; geothermal: 16%; hydropower: 12%; biomass power: 7%.
Electricity - consumption: 2.929 billion kWh (2006)
Electricity - exports: 69.34 million kWh (2006)
Electricity - imports: 0 kWh (2006)
Agriculture - products: coffee, bananas, sugarcane, cotton, rice, corn, tobacco, sesame, soya, beans; beef, veal, pork, poultry, dairy products; shrimp, lobsters
Exports - commodities: coffee, beef, shrimp and lobster, cotton, tobacco, peanuts, sugar, bananas, gold
Imports - commodities: consumer goods, machinery and equipment, raw materials, petroleum products
Currency: 1 gold Cordoba (C$) = 100
digital technology, owing to investments since privatization of the formerly state-owned telecommunications company; since privatization, access to fixed-line and mobile-cellular services has improved; fixed-line teledensity is roughly 5 per 100 persons; mobile-cellular telephone subscribership has increased to roughly 85 per 100 persons (2011). Satellite earth stations: 1 Intersputnik (Atlantic Ocean region) and 1 Intelsat (Atlantic Ocean) (2011). Communications cables: the Americas Region Caribbean Ring System (ARCOS-1) fiber-optic submarine cable provides connectivity to South and Central America, parts of the Caribbean, and the US (2011). Internet
Top-level domain: .ni
Internet users: 773,240 users, 121st in the world; 13.5% of the population, 159th in the world (2012)
Fixed broadband: 95,023 subscriptions, 102nd in the world; 1.7% of the population, 131st in the world (2012)
Wireless broadband: 58,365 subscriptions, 123rd in the world; 1.0% of the population, 133rd in the world (2012)
Internet hosts: 296,068 hosts, 63rd in the world (2012)
IPv4: 369,408 addresses allocated, less than 0.05% of the world total, 64.5 addresses per 1000 people (2012)
Internet Service Providers: 5 ISPs (1999); cable internet in widespread use; DSL and WAP available in major cities
Internet censorship and surveillance There are no government restrictions on access to the Internet or Internet chat rooms; however, several NGOs claim the government monitors their e-mail. Individuals and groups engage in the expression of views via the Internet, including by e-mail and social media. The constitution provides for freedom of speech and press, but the government used administrative, judicial, and financial means to limit the exercise of these rights. Although the law provides that the right to information cannot be subjected to censorship, it also establishes retroactive liability, including criminal penalties for libel and slander.
During the November 2012 municipal elections, a popular website that allowed voters to register complaints or allegations of election fraud was apparently hacked on several occasions and forced to shut down for significant portions of the day. Certain NGOs claimed the website was tampered with to prevent dissemination of voter complaints. During 2012 there were several reported cases of threats and violence against the press. On December 11, the spokesman of the Supreme Court of Justice publicly accused the online newsweekly Confidencial of being financed by narcotics trafficking organizations, an allegation rights groups said was politically motivated. See also Media supplemented by cable TV in most urban areas (2007). Television sets: 320,000 (1997). Media restrictions Independent media are active and express a variety of views. The government, however, restricts media freedom through harassment, censorship, arbitrary application of libel laws, and use of national security justifications. Private individuals also harass media for criticizing the government. President Ortega frequently uses a law that allows for government broadcasts of emergency messages to force national networks either to broadcast his speeches or to cease other programming temporarily during those times. The government continues to close opposition radio stations and cancel opposition television programs, allegedly for political reasons. It also employs vandalism, the seizure of privately owned broadcast equipment, and criminal defamation charges against media outlet owners or program hosts to limit freedom and diversity of the press. Opposition news sources report that generally they are not permitted to enter official government events and are denied interviews by government officials.
In June 2012 the Nicaraguan Association for Human Rights (ANPDH) claimed that the Nicaraguan National Police (NNP) forcibly closed the Somoto-based Television Channel 13 due to the station's reporting on government corruption. The owner of the station, Juan Carlos Pineda, claimed that NNP officials harassed and threatened him prior to the forced closure. There were no reports of an investigation, and at the end of 2012 the station remained closed. The Communications Research Centre of Nicaragua (CINCO) reported that control over television media by the Sandinista National Liberation Front (FSLN) and President Ortega strengthened throughout 2012. National television was increasingly either controlled by FSLN supporters or directly owned and administered by President Ortega's family members. Eight of the nine basic channels available were under direct FSLN influence. In general, media outlets owned by the presidential family limited programming to pro-government or FSLN propaganda and campaign advertisements. Press and human rights organizations claimed that the use of state funds for official media placed opposition outlets at an unfair disadvantage. Some journalists practice self-censorship, fearing economic and physical repercussions for investigative reporting on crime or official corruption. Telephones Calling code: +505 International call prefix: 00 Main lines: 320,000 lines in use, 112th
of public transport, in 2016 the OpenStreetMap group in Nicaragua, MapaNica, crowdsourced, with the help of more than 150 citizens of Managua, the first bus transit map in all of Central America. Later, in 2018, they made this data machine-accessible, and it now serves different apps on several platforms. Urban buses in Managua will use Brazilian dual-mode buses and hybrid electric buses that are currently in an evaluation process. Suburban buses Suburban buses (Suburbanos) connect larger cities with communities in outer areas. They stop only a few times inside the city and, beyond it, nearly anywhere passengers request to get off. As with urban buses, a team serves a route several times per day and the service is organized by the local government. Prices can vary depending on the distance. Ruteados Connecting two or more cities, Ruteados (also called Servicio Ordinario) make up the biggest part of bus services in Nicaragua. Express buses Express buses (Expresos) connect two or more cities, like Ruteados and share taxis, but with fewer stops, resulting in a faster travel time. Share taxis Share taxis, called Interlocales in Nicaragua, also connect two or more cities, like Ruteados and express buses, with the main difference that they depart from the bus station once they are mostly or completely filled with passengers. Like express buses, they make almost no stops between origin and destination. Air transport Several airports serve both national and international flights. Airports As of 2013, 147 airports exist in Nicaragua. Nicaragua's main international airport is Managua International Airport.
Airports - with paved runways In total, there are 12 airports with paved runways, with the following lengths:
2,438 to 3,047 m: 3
1,524 to 2,437 m: 2
914 to 1,523 m: 3
under 914 m: 4
Airports - with unpaved runways In total, there are 135 airports with unpaved runways, with the following lengths:
1,524 to 2,437 m: 1
914 to 1,523 m: 15
under 914 m: 119
Water transport Nicaragua offers 2,220 km of navigable waterways, including the two large lakes, Lake Nicaragua and Lake Managua. A Nicaragua Canal was planned but canceled on 21 February 2018. Ports and harbors Atlantic Ocean (Caribbean) Bluefields El Bluff Puerto Cabezas Pacific
every town organizes it on its own behalf. In Estelí every bus driver is assisted by mostly two persons (Ayudantes); bus drivers in Managua have to manage their job on their own. The vehicles used also differ heavily between cities: in Managua mostly urban buses sponsored by Russia are used, in Estelí former school buses from the United States, in Bluefields Japanese light commercial vans, and in León pickup trucks extended with seats and a roof. The quality of bus stops also differs heavily. In the center of Managua many proper bus stops exist, with roofs or at least signs; in other areas there often is no indication of a bus stop. Nevertheless, buses serve a network of established stops with common names known by bus assistants. Passengers need to know or ask where and when which bus stops.
was later advanced to the first half of 1995. The army reform measures were launched with deep cuts in personnel strength, the abolition of conscription, and the disbanding of the militia. The size of the army declined from a peak strength of 97,000 troops to an estimated 15,200 in 1993, accomplished by voluntary discharges and forced retirements. Under the Sandinistas, the army general staff embodied numerous branches and directorates: artillery, combat readiness, communications, Frontier Guards, military construction, intelligence, counterintelligence, training, operations, organization and mobilization, personnel, and logistics. Most of these bodies appear to have been retained, although they have been trimmed and reorganized. The Nicaraguan Air Force and Navy were also subordinate to the army general staff. Since 1990 the mission of the EPS has been to ensure the security of the national borders and to deal with internal disturbances. Its primary task has been to prevent disorder and violence wrought by armed bands of former Contra and Sandinista soldiers. In November and December 1992, the EPS was deployed alongside the National Police to prevent violence during demonstrations by the National Workers' Front for improved pay and benefits. The EPS and the Frontier Guards also assist the police in narcotics control. A small EPS contingent works alongside demobilized Contras in a Special Disarmament Brigade to reduce the arsenal of weapons in civilian hands. National Army of Nicaragua, 1995–2006 In 1995, the National Army of Nicaragua (Ejército de Nicaragua), which had never previously been fully apolitical, evolved through constitutional reforms into a more traditional Central American military. As ties to the FSLN weakened, military leaders turned over power regularly without "fuss," refrained from becoming involved in the political realm, and the overall size of the military decreased significantly.
National Army of Nicaragua, 2006–present Under President Ortega, multiple changes have occurred strengthening FSLN control over the national military. During 2010, the national assembly "passed changes that allowed [the] politicization of the country's security forces, while expanding these agencies' domestic powers." This change effectively erased the shift towards an apolitical force made from 1995 to 2006. Then in 2014, President Ortega supported a constitutional reform removing the defense and governance ministries "from the security forces' chain of command, reducing oversight and leaving [President] Ortega in charge of appointing military and police commanders." This action enhanced President Ortega's political and personal control over the nation's security forces and personnel. President Ortega has also strengthened his ability to control the general population through two different national security initiatives. In 2015, the Sovereign Security Law "erased barriers between internal and external security, and gave the Ortega government wide discretion to use coercion against any person or entity deemed a threat to the state, society, or economy." The Sovereign Security Law gave the Ortega administration the right to infringe upon the basic human rights protected in the Nicaraguan constitution, if deemed necessary. Also, CPCs "have been replaced by Family, Community, and Life Cabinets (Gabinetes)." These cabinets are linked to the police and provide the government with a means to keep communities under constant surveillance. In the contemporary period, multiple changes have taken place in the military regarding purpose and structure. The military currently serves as a force for national defense, public security, civil defense, and national development. In 2014, an expansion of institutional powers granted the military the opportunity for greater involvement in international security initiatives.
The National Army of Nicaragua also has the highest public approval ratings of any Nicaraguan institution. Army equipment Light equipment
Degtyaryov machine gun
Makarov PM
M1911 pistol
Smith & Wesson Model 10
Browning Hi-Power
Glock 17
Jericho 941
Heckler & Koch MP5
PPSh-41
IMI Uzi
IMI Mini Uzi
FN FAL
Heckler & Koch G3
AK-74MS
Type 58 rifle
Type 56 assault rifle
Pistol Mitralieră model 1963/1965
Romanian RPK version of the md. 63, called the md. 64
Pistol Mitralieră model 1990
Puşcă Mitralieră model 1964 ("model 1964 light machine gun")
AIM/AIMS: AIM – 7.62×39mm; PM md. 65 with cleaning rod removed – 7.62×39mm, an early version of the AIMS with an under-folding stock and inward-curved grip; AIMS – 7.62×39mm; AIMS with 75-round drum magazine – 7.62×39mm
AIMR: first model AIMR with 20-round magazine – 7.62×39mm (original Romanian designation PM md. 80); AIMR – 7.62×39mm (original Romanian designation PM md. 90 cu țeavă scurtă, "short barrelled"); AIMR – 5.56×45mm (original Romanian designation PA md. 97 cu țeavă scurtă, "short barrelled")
Romanian AK Draco Pistol – 7.62×39mm: a US import variant of the AIMR, identifiable by its lack of a stock, a plain hand guard without palm swell, and a 2-position selector switch
Romanian AK Draco Carbine – 7.62×39mm: a Draco pistol fitted with an AIMS folding stock to replicate the original AIMR; it still lacks the palm-swell hand guard and 3-position selector switch
PM md. 80 (Pistol Mitralieră model 1980)
PM md. 90 (Pistol Mitralieră model 1990)
AK-103, used by Nicaraguan Special Forces
AK-47: Type I AK-47, hybrid stamped/milled receiver with prototype slab-sided magazine – 7.62×39mm; Type II AK-47 (note stock mounting bracket) with prototype slab-sided magazine – 7.62×39mm; Type II AK-47 – 7.62×39mm; Type III AK-47 with prototype slab-sided magazine – 7.62×39mm
AKM
AKMS / MPiKMS
AKMSK
Zastava M70: M-70A – milled receiver, underfolding stock; M-70A1 – milled receiver, underfolding stock, mount for night or optical sights; M-70B1 – stamped receiver, fixed stock; M-70AB2 – stamped receiver, underfolding stock; M-70B1N – stamped receiver, fixed stock, mount for night or optical sights; M-70AB2N – stamped receiver, underfolding stock, mount for night or optical sights; M-70AB3 – stamped receiver, underfolding stock, rifle grenade sight removed and replaced with a BGP 40mm underslung grenade launcher; M-70B3 – stamped receiver, fixed stock, rifle grenade sight removed and replaced with a BGP 40mm underslung grenade launcher; M-92 – carbine, the shorter variant of the M-70AB2; PAP M-70 – semi-automatic variant intended for the civilian market
MPi-KM/MPi-KMS-72: East German MPi-KM-72 with fixed stock – 7.62×39mm, the transitional MPi-KM-72 that still used the wooden lower hand grip from the MPi-KM, common from 1965 to 1972 (the side-folding stock was not widely distributed until 1973); East German MPi-KM with plastic stock – 7.62×39mm; East German MPi-KMS-72 with sling and side-folding stock – 7.62×39mm
AK-74 – assault rifle: AKS-74 – side-folding stock; AK-74N (AKS-74N) – night scope rail; AKS-74U – compact carbine; AKS-74UN – night scope rail
AK-63: AK-63F (AMM in Hungarian service) – the basic fixed-stock copy of the Soviet AKM; AK-63D (AMMSZ in Hungarian service) – an AKMS copy with an under-folding steel stock; AK-63MF – modernised AK-63D with telescopic stock and MIL-STD-1913 Picatinny rail
SA-85M: A semi-automatic-only version intended for civilian sales in the United States; imported by Kassnar in both pre- and post-ban versions. IMI Galil – 10,000 IMI Micro Galil IMI Micro Galil IMI MAR Galil IMI SAR Galil IMI ARM Galil IMI AR Galil T65 M16A1 & M16A2 rifle – 6,000 SIG SG 540 Ithaca 37 Remington-870 shotgun M67 grenade M59 grenade M34 grenade M26A1 grenade AN M14 AN M18 M79 grenade launcher – 64 Heckler & Koch HK69A1 / MZP-1 FAMAE SAF – Standard and mini-versions HK MP5 sub-machine guns RPK RPKS (folding stock) RPKS-74M RPKS-74 RPKS-74N RPKSN RPK-74m RPKN RPD RPK(S)N night scope rail RPK(S)Lflash suppressor& night scope rail RPKM (modernized) RPK-203 (export variant) RPK-204 (7.62×51mm NATO) AGS-17 Plamya AGS-30 Atlant light automatic grenade launcher Armoured vehicles T-72 – MBT – 20 T-72Bs delivered 2016 T-54/55 – 62 – 156 delivered (20 T-54 & 136 T-55) some via Bulgaria | very limited. There were no Ministry of Defense offices and no vice ministers to shape national defense policies or exercise civilian control over the armed forces. Under the Law of Military Organization of the Sandinista Popular Army enacted just before Chamorro's election victory, Humberto Ortega retained authority over promotions, military construction, and force deployments. He contracted for weapons procurement and drafted the military budget presented to the government. Only an overall budget had to be submitted to the legislature, thus avoiding a line-item review by the National Assembly. Sandinista officers remained at the head of all general staff directorates and military regions. The chief of the army, Major General Joaquín Cuadra Lacayo, continued in his pre-Chamorro position. Facing domestic pressure to remove Humberto Ortega and the risk of curtailment of United States aid as long as Sandinistas remained in control of the armed forces, Chamorro announced that Ortega would be replaced in 1994. 
Ortega challenged her authority to relieve him and reiterated his intention to remain at the head of the EPS until the army reform program was completed in 1997. This date was later advanced to the first half of 1995. The army reform measures were launched with deep cuts in personnel strengths, the abolition of conscription, and the disbanding of the militia. The size of the army declined from a peak strength of 97,000 troops to an estimated 15,200 in 1993, accomplished by voluntary discharges and forced retirements. Under the Sandinistas, the army general staff comprised numerous branches and directorates: artillery, combat readiness, communications, Frontier Guards, military construction, intelligence, counterintelligence, training, operations, organization and mobilization, personnel, and logistics. Most of these bodies appear to have been retained, although they have been trimmed and reorganized. The Nicaraguan Air Force and Navy were also subordinate to the army general staff. Since 1990 the mission of the EPS has been to ensure the security of the national borders and to deal with internal disturbances. Its primary task has been to prevent disorder and violence wrought by armed bands of former Contra and Sandinista soldiers. In November and December 1992, the EPS was deployed alongside the National Police to prevent violence during demonstrations by the National Workers' Front for improved pay and benefits. The EPS and the Frontier Guards also assist the police in narcotics control. A small EPS contingent works alongside demobilized Contras in a Special Disarmament Brigade to reduce the arsenal of weapons in civilian hands. National Army of Nicaragua, 1995–2006 In 1995, the National Army of Nicaragua (Ejército de Nicaragua), which had never previously been fully apolitical, evolved through constitutional reforms into a more traditional Central American military.
As ties to the FSLN weakened, military leaders turned over power regularly without “fuss,” refrained from becoming involved in the political realm, and the overall size of the military significantly decreased. National Army of Nicaragua, 2006–present Under President Ortega, multiple changes have occurred that strengthen FSLN control over the national military. During 2010, the national assembly “passed changes that allowed [the] politicization of the country’s security forces, while expanding these agencies’ domestic powers.” This change effectively erased the shift towards an apolitical force that had taken place from 1995 to 2006. Then in 2014, President Ortega supported a constitutional reform removing the defense and governance ministries “from the security forces’ chain of command, reducing oversight and leaving [President] Ortega in charge of appointing military and police commanders.” This action enhanced President Ortega’s political and personal control over the nation’s security forces and personnel. President Ortega has also strengthened his ability to control the general population through two different national security initiatives. In 2015, the Sovereign Security Law “erased barriers between internal and external security, and gave the Ortega government wide discretion to use coercion against any person or entity deemed a threat to the state, society, or economy.” The Sovereign Security Law provided the Ortega administration the right to infringe upon the basic human rights protected in the Nicaraguan constitution, if deemed necessary. Also, CPCs “have been replaced by Family, Community, and Life Cabinets (Gabinetes).” These cabinets are linked to the police and provide the government with a means to keep communities under constant surveillance. In the contemporary period, multiple changes have taken place in the military regarding purpose and structure.
The military currently serves as a force for national defense, public security, civil defense, and national development. In 2014, an expansion of institutional powers granted the military the opportunity for greater involvement in international security initiatives. The National Army of Nicaragua also has the highest public approval ratings of any Nicaraguan institution.
International Labour Organization (ILO)
UN Human Rights Commission (UNHRC)
Organization of American States (OAS)
Non-Aligned Movement (NAM)
International Atomic Energy Agency (IAEA)
Inter-American Development Bank (IDB)
Central American Common Market (CACM)
Central American Bank for Economic Integration (CABEI)
Bolivarian Alliance for the Americas (ALBA)
Caribbean Community (CARICOM)
Association of Caribbean States (ACS)
Community of Latin American and Caribbean States (CELAC)
Latin American Economic System (SELA)
Central American Integration System (SICA)

International disputes

Territorial dispute with Colombia over the Archipelago de San Andres y Providencia and Quita Sueno Bank. With respect to the maritime boundary question in the Golfo de Fonseca, the ICJ referred to the line determined by the 1900 Honduras-Nicaragua Mixed Boundary Commission and advised that some tripartite resolution among El Salvador, Honduras and Nicaragua would likely be required. There is also a maritime boundary dispute with Honduras in the Caribbean Sea. Nicaragua is sovereign over the Rio San Juan, and by treaty Costa Rica has the right to navigate over part of the river with 'objects of commerce'; a dispute emerged when Costa Rica tried to navigate with armed members of its security forces.

International relations with IGOs and countries

Nicaragua signed a 3-year Poverty Reduction and Growth Facility (PRGF) with the International Monetary Fund (IMF) in October 2007. As part of the IMF program, the Government of Nicaragua agreed to implement free market policies linked to targets on fiscal discipline, poverty spending, and energy regulation.
The lack of transparency surrounding Venezuelan bilateral assistance, channeled through state-run enterprises rather than the official budget, has become a serious issue for the IMF and international donors. On September 10, 2008, with misgivings about fiscal transparency, the IMF released an additional $30 million to Nicaragua, the second tranche of its $110 million PRGF. The flawed municipal elections of November 2008 prompted a number of European donors to suspend direct budget support to Nicaragua, a move that created a severe budget shortfall for the government. This shortfall, in turn, caused the Government of Nicaragua to fall out of compliance with its PRGF obligations and led to a suspension of PRGF disbursements. The IMF is currently in negotiations with the Government of Nicaragua to reinstate disbursements. Under current president Daniel Ortega, Nicaragua has stayed current with the Central American-Dominican Republic Free Trade Agreement, which entered into force for Nicaragua on April 1, 2006. Nicaraguan exports to the United States, which account for 59% of Nicaragua's total exports, were $1.7 billion in 2008, up 45% from 2005. Textiles and apparel account for 55% of exports to the United States, while automobile wiring harnesses add another
era also saw the flourishing of Saharan rock art, most notably in the Aïr Mountains, Termit Massif, Djado Plateau, Iwelene, Arakao, Tamakon, Tzerzait, Iferouane, Mammanet and Dabous; the art spans the period from 10,000 BC to 100 AD and depicts a range of subjects, from the varied fauna of the landscape to depictions of spear-carrying figures dubbed 'Libyan warriors'. Empires and kingdoms in pre-colonial Niger Knowledge of early Nigerien history is limited by the lack of written sources, though it is known that by at least the 5th century BC the territory of modern Niger had become an area of trans-Saharan trade. Tuareg tribes from the north led caravans of camels, a well-adapted means of transportation through what had become an immense desert. This mobility, which would continue in waves for several centuries, was accompanied by further migration to the south and intermixing between sub-Saharan African and North African populations, as well as the gradual spread of Islam. It was also aided by the Arab invasion of North Africa at the end of the 7th century, which resulted in population movements to the south. Several empires and kingdoms flourished in the Sahel during this era. Their history does not fit easily within the modern boundaries of Niger, which were created during the period of European colonialism; the following adopts a roughly chronological account of the main empires. Mali Empire (1200s–1400s) The Mali Empire was a Mandinka empire founded by Sundiata Keita (r. 1230–1255) in circa 1230 and existed until around 1600. As detailed in the Epic of Sundiata, Mali emerged as a breakaway region of the Sosso Empire, which itself had split from the earlier Ghana Empire. Thereafter Mali defeated the Sosso at the Battle of Kirina in 1235 and then Ghana in 1240.
From its heartland around the modern Guinea-Mali border region, the empire expanded considerably under successive kings and came to dominate the Trans-Saharan trade routes, reaching its greatest extent during the rule of Mansa Musa (r. 1312–1337). At this point parts of what are now Niger's Tillabéri Region fell under Malian rule. A Muslim, Mansa Musa performed the hajj in 1324–25 and encouraged the spread of Islam in the empire, though it appears that most ordinary citizens continued to maintain their traditional animist beliefs instead of or alongside the new religion. The empire began declining in the 15th century due to a combination of internecine strife over the royal succession, weak kings, the shift of European trade routes to the coast, and rebellions in the empire's periphery by Mossi, Wolof, Tuareg and Songhai peoples. However a rump Mali kingdom continued to exist until the late 1600s. Songhai Empire (1000s–1591) The Songhai Empire was named for its main ethnic group, the Songhai or Sonrai, and was centred on the bend of the Niger River in modern Mali. Songhai began settling this region from the 7th to 9th centuries; by the early 11th century Gao (capital of the former Kingdom of Gao) had become the empire's capital. From 1000 to 1325, the Songhai Empire prospered and managed to maintain peace with the Mali Empire, its powerful neighbour to the west. In 1325 Songhai was conquered by Mali, only regaining its independence in 1375. Under King Sonni Ali (r. 1464–1492) Songhai adopted an expansionist policy which reached its apogee during the reign of Askia Mohammad I (r. 1493–1528); at this point the empire had expanded considerably from its Niger-bend heartland to the east, where much of modern western Niger fell under its rule, including Agadez, which was conquered in 1496.
However the empire was unable to withstand repeated attacks from the Saadi Dynasty of Morocco and was decisively defeated at the Battle of Tondibi in 1591; the empire then collapsed into a number of smaller kingdoms. Sultanate of Aïr (1400s–1906) In c. 1449 in the north of what is now Niger, the Sultanate of Aïr was founded by Sultan Ilisawan, based in Agadez. Formerly a small trading post inhabited by a mixture of Hausa and Tuaregs, the sultanate grew rich due to its strategic position on the Trans-Saharan trade routes. In 1515 Aïr was conquered by Songhai, remaining a part of that empire until its collapse in 1591. The following centuries present a somewhat confused picture, though it seems that the sultanate entered a decline marked by internecine wars and clan conflicts. When Europeans began exploring the region in the 19th century much of Agadez lay in ruins, and it was taken over, though with difficulty, by the French (see below). Kanem–Bornu Empire (700s–1700s) To the east, the Kanem–Bornu Empire dominated the region around Lake Chad for much of this period. It was founded by the Zaghawa around the 8th century and based in Njimi, north-east of the lake. The kingdom gradually expanded, especially during the rule of the Sayfawa Dynasty which began in c. 1075 under Mai (king) Hummay. The kingdom reached its greatest extent in the 1200s, largely thanks to the efforts of Mai Dunama Dibbalemi (r. 1210–1259), and grew rich from its control of many Trans-Saharan trade routes; much of eastern and south-eastern Niger, notably Bilma and Kaouar, was under Kanem's control in this period. Islam had been introduced to the kingdom by Arab traders from the 11th century, gradually gaining more converts over the following centuries. Attacks by the Bulala people in the late 14th century forced Kanem to shift to the west of Lake Chad, where it became known as the Bornu Empire, ruled from its capital Ngazargamu on the modern Niger-Nigeria border.
Bornu prospered during the rule of Mai Idris Alooma (r. circa 1575–1610) and re-conquered much of the traditional lands of Kanem, hence the designation 'Kanem–Bornu' for the empire. By the late 17th century and into the 18th the Bornu kingdom had entered a long period of decline, gradually shrinking back to its Lake Chad heartland, though it remained an important player in the region. Circa 1730–40 a group of Kanuri settlers led by Mallam Yunus left Kanem and founded the Sultanate of Damagaram, centred on the town of Zinder. The sultanate remained nominally subject to the Bornu Empire until the reign of Sultan Tanimoune Dan Souleymane in the mid-to-late 19th century, who declared independence and initiated a phase of vigorous expansion. The sultanate managed to resist the advance of the Sokoto Caliphate (see below), but was later captured by the French in 1899. The Hausa states and other smaller kingdoms (1400s–1800s) Between the Niger River and Lake Chad lay various Hausa Kingdoms, encompassing the cultural-linguistic area known as Hausaland which straddles the modern Niger-Nigeria border. The origins of the Hausa are obscure, though they are thought to be a mixture of autochthonous peoples and migrant peoples from the north and/or east, emerging as a distinct people sometime in the 900s–1400s when the kingdoms were founded. They gradually adopted Islam from the 14th century, though often this existed alongside traditional religions, developing into unique syncretic forms; some Hausa groups, such as the Azna, resisted Islam altogether (the area of Dogondoutchi remains an animist stronghold to this day). The Hausa kingdoms were not a compact entity but several federations of kingdoms more or less independent of one another. Their organisation was hierarchical though also somewhat democratic: the Hausa kings were elected by the notables of the country and could be removed by them.
The Hausa Kingdoms began as seven states founded, according to the Bayajidda legend, by the six sons of Bawo. Bawo was the only son of the Hausa queen Daurama and Bayajidda (or Abu Yazid according to certain Nigerien historians), who came from Baghdad. The seven original Hausa states (often referred to as the 'Hausa bakwai') were: Daura (state of queen Daurama), Kano, Rano, Zaria, Gobir, Katsina and Biram. An extension of the legend states that Bawo had a further seven sons with a concubine, who went on to found the so-called 'Banza (illegitimate) Bakwai': Zamfara, Kebbi, Nupe, Gwari, Yauri, Ilorin and Kwararafa. A smaller state not fitting into this scheme was Konni, centred on Birni-N'Konni. The Fulani (also called Peul, Fulbe etc.), a pastoral people found throughout the Sahel, began migrating to Hausaland during the 1200s–1500s. During the later 18th century many Fulani were unhappy with the syncretic form of Islam practised there; exploiting also the populace's disdain for corruption amongst the Hausa elite, the Fulani scholar Usman Dan Fodio (from Gobir) declared a jihad in 1804. After conquering most of Hausaland (though not the Bornu Kingdom, which remained independent) he proclaimed the Sokoto Caliphate in 1809. Some of the Hausa states survived by fleeing south, such as Katsina, which moved to Maradi in the south of modern Niger. Many of these surviving states harassed the Caliphate and a long period of small-scale wars and skirmishes commenced, with some states (such as Katsina and Gobir) maintaining independence, whereas elsewhere new ones were formed (such as the Sultanate of Tessaoua). The Caliphate managed to survive until, fatally weakened by the invasions of Chad-based warlord Rabih az-Zubayr, it finally fell to the British in 1903, with its lands later being partitioned between Britain and France. Other smaller kingdoms of the period include the Dosso Kingdom, a Zarma polity founded in 1750 which resisted the rule of the Hausa and Sokoto states.
French Niger (1900–58) In the 19th century Europeans began to take a greater interest in Africa; several European explorers travelled in the area of modern Niger, such as Mungo Park (in 1805–06), the Oudney-Denham-Clapperton expedition (1822–25), Heinrich Barth (1850–55; with James Richardson and Adolf Overweg), Friedrich Gerhard Rohlfs (1865–67), Gustav Nachtigal (1869–74) and Parfait-Louis Monteil (1890–92). Several European countries already possessed littoral colonies in Africa, and in the latter half of the century they began to turn their eyes towards the interior of the continent. This process, known as the 'Scramble for Africa', culminated in the 1885 Berlin conference in which the colonial powers outlined the division of Africa into spheres of influence. As a result of this, France gained control of the upper valley of the Niger River (roughly equivalent to the areas of modern Mali and Niger). France then set about making a reality of their rule on the ground. In 1897 the French officer Marius Gabriel Cazemajou was sent to Niger; he reached the Sultanate of Damagaram in 1898 and stayed in Zinder at the court of Sultan Amadou Kouran Daga—however he was later killed as Daga feared he would ally with the Chad-based warlord Rabih az-Zubayr. In 1899–1900 France coordinated three expeditions—the Gentil Mission from French Congo, the Foureau-Lamy Mission from Algeria and the Voulet–Chanoine Mission from Timbuktu—with the aim of linking France's African possessions. The three eventually met at Kousséri (in the far north of Cameroon) and defeated Rabih az-Zubayr's forces at the Battle of Kousséri. The Voulet-Chanoine Mission was marred by numerous atrocities, and became notorious for pillaging, looting, raping and killing many local civilians on its passage throughout southern Niger. 
On 8 May 1899, in retaliation for the resistance of Queen Sarraounia, Captain Voulet and his men murdered all the inhabitants of the village of Birni-N'Konni in what is regarded as one of the worst massacres in French colonial history. The brutal methods of Voulet and Chanoine caused a scandal and Paris was forced to intervene; however, when Lieutenant-Colonel Jean-François Klobb caught up with the mission near Tessaoua to relieve them of command, he was killed. Lt. Paul Joalland, Klobb's former officer, and Lt. Octave Meynier eventually took over the mission following a mutiny in which Voulet and Chanoine were killed. The Military Territory of Niger was subsequently created within the Upper Senegal and Niger colony (modern Burkina Faso, Mali and Niger) in December 1904, with its capital at Niamey, then little more than a large village. The border with Britain's colony of Nigeria to the south was finalised in 1910, a rough delimitation having already been agreed by the two powers via several treaties during the period 1898–1906. The capital of the territory was moved to Zinder in 1912 when the Niger Military Territory was split off from Upper Senegal and Niger, before being moved back to Niamey in 1922 when Niger became a fully-fledged colony within French West Africa. The borders of Niger were drawn up in various stages and had been fixed at their current position by the late 1930s. Various territorial adjustments took place in this period: the areas west of the Niger river were only attached to Niger in 1926–27; during the dissolution of Upper Volta (modern Burkina Faso) in 1932–47 much of the east of that territory was added to Niger; and in the east the Tibesti Mountains were transferred to Chad in 1931. The French generally adopted a form of indirect rule, allowing existing native structures to continue to exist within the colonial framework of governance provided that they acknowledged French supremacy.
The Zarma of the Dosso Kingdom in particular proved amenable to French rule, allying with the French against the encroachments of Hausa and other nearby states; over time the Zarma thus became one of the more educated and westernised groups in Niger. However, perceived threats to French rule, such as the Kobkitanda rebellion in Dosso Region (1905–06), led by the blind cleric Alfa Saibou, and the Karma revolt in the Niger valley (December 1905–March 1906), led by Oumarou Karma, were suppressed with force, as were the later Hamallayya and Hauka religious movements. Though largely successful in subduing the sedentary populations of the south, the French faced considerably more difficulty with the Tuareg in the north (centred on the Sultanate of Aïr in Agadez), and France was unable to occupy Agadez until 1906. Tuareg resistance continued, however, culminating in the Kaocen revolt of 1916–17, led by Ag Mohammed Wau Teguidda Kaocen with backing from the Senussi in Fezzan; the revolt was violently suppressed and Kaocen fled to Fezzan, where he was later killed. A puppet sultan was installed by the French, and the decline and marginalisation of the north of the colony continued, exacerbated by a series of droughts. Though it remained something of a backwater, some limited economic development took place in Niger during the colonial years, such as the introduction of groundnut cultivation. Various measures to improve food security following a series of devastating famines in 1913, 1920 and 1931 were also introduced. During the Second World War, during which mainland France was occupied by Nazi Germany, Charles de Gaulle issued the Brazzaville Declaration, declaring that the French colonial empire would be replaced post-war with a less centralised French Union.
The French Union, which lasted from 1946 to 1958, conferred a limited form of French citizenship on the inhabitants of the colonies, with some decentralisation of power and limited participation in political life for local advisory assemblies. It was during this period that the Nigerien Progressive Party (Parti Progressiste Nigérien, or PPN, originally a branch of the African Democratic Rally, or Rassemblement Démocratique Africain – RDA) was formed under the leadership of former teacher Hamani Diori, as well as the left-wing Mouvement Socialiste Africain-Sawaba (MSA) led by Djibo Bakary. Following the Overseas Reform Act (Loi Cadre) of 23 July 1956 and the establishment of the Fifth French Republic on 4 October 1958, Niger became an autonomous state within the French Community. On 18 December 1958, an autonomous Republic of Niger was officially created under the leadership of Hamani Diori. The MSA was banned in 1959 for its perceived excessive anti-French stance. On 11 July 1960, Niger decided to leave the French Community and acquired full independence at midnight, local time, on 3 August 1960; Diori thus became the first president of the country. Independent Niger (1960–present) Diori years (1960–74) For its first 14 years as an independent state Niger was run by a single-party civilian regime under the presidency of Hamani Diori. The 1960s were largely peaceful, and saw a large expansion of the education system and some limited economic development and industrialisation. Links with France remained deep, with Diori allowing the development of French-led uranium mining in Arlit and supporting France in the Algerian War. Relations with other African states were mostly positive, with the exception of Dahomey (Benin), owing to an ongoing border dispute.
Niger remained a one-party state throughout this period, with Diori surviving a planned coup in 1963 and an assassination attempt in 1965; much of this activity was masterminded by Djibo Bakary's MSA-Sawaba group, which had launched an abortive rebellion in 1964. In the early 1970s, a combination of economic difficulties, devastating droughts and accusations of rampant corruption and mismanagement of food supplies resulted in a coup d'état that overthrew the Diori regime. First military regime (1974–1991) The coup had been masterminded by Col. Seyni Kountché and a small military group under the name of the Conseil Militaire Supreme, with Kountché going on to rule the country until his death in 1987. The first action of the military government was to address the food crisis. Whilst political prisoners of the Diori regime were released after the coup and the country was stabilised, political and individual freedoms in general deteriorated during this period. There were several attempted coups (in 1975, 1976 and 1984) which were thwarted, their instigators being severely punished. Despite the restriction in freedom, the country enjoyed improved economic development as Kountché sought to create a 'development society', funded largely by the uranium mines in Agadez Region. Several parastatal companies were created, major infrastructure (buildings, new roads, schools, health centres) was constructed, and there was minimal corruption in government agencies, which Kountché did not hesitate to punish severely. In the 1980s Kountché began cautiously loosening the grip of the military, with some relaxation of state censorship and attempts made to 'civilianise' the regime. However the economic boom ended following the collapse in uranium prices, and IMF-led austerity and privatisation measures provoked opposition from many Nigeriens. In 1985 a small Tuareg revolt in Tchintabaraden was suppressed.
Kountché died in November 1987 from a brain tumour and was succeeded by his chief of staff, Col. Ali Saibou, who was confirmed as Chief of the Supreme Military Council four days later. Saibou significantly curtailed the most repressive aspects of the Kountché era (such as the secret police and media censorship) and set about introducing a process of political reform under the overall direction of a single party (the Mouvement National pour la Société du Développement, or MNSD). A Second Republic was declared and a new constitution was drawn up, which was adopted following a referendum in 1989. General Saibou became the first president of the Second Republic after winning the presidential election on 10 December 1989. President Saibou's efforts to control political reforms failed in the face of trade union and student demands to institute a multi-party democratic system. On 9 February 1990, a violently repressed student march in Niamey resulted in the deaths of three students, increasing national and international pressure for further democratic reform. The Saibou regime acquiesced to these demands by the end of 1990. Meanwhile, trouble re-emerged in Agadez Region when a group of armed Tuaregs attacked the town of Tchintabaraden (generally seen as the start of the first Tuareg Rebellion), prompting a severe military crackdown which led to many deaths (the precise numbers are disputed, with estimates ranging from 70 to up to 1,000).

National Conference and Third Republic (1991–1996)

The National Sovereign Conference of 1991 marked a turning point in the post-independence history of Niger and brought about multi-party democracy. From 29 July to 3 November, a national conference gathered together all elements of society to make recommendations for the future direction of the country. The conference was presided over by Prof.
André Salifou and developed a plan for a transitional government; this was then installed in November 1991 to manage the affairs of state until the institutions of the Third Republic were put into place in April 1993. After the National Sovereign Conference, the transitional government drafted a new constitution that eliminated the previous single-party system of the 1989 Constitution and guaranteed more freedoms. The new constitution was adopted by a referendum on 26 December 1992. Following this, presidential elections were held and Mahamane Ousmane became the first president of the Third Republic on 27 March 1993. Ousmane's presidency was characterised by political turbulence, with four government changes and early legislative elections in 1995, as well as a severe economic slump which the coalition government proved unable to effectively address. The violence in Agadez Region continued during this period, prompting the Nigerien government to sign a truce with Tuareg rebels in 1992; this, however, proved ineffective owing to internal dissension within the Tuareg ranks. Another rebellion broke out in the east of the country, led by dissatisfied Toubou peoples who claimed that, like the Tuareg, they had been neglected by the Nigerien government. In April 1995 a peace deal with the main Tuareg rebel group was signed, with the government agreeing to absorb some former rebels into the military and, with French assistance, help others return to a productive civilian life.

Second military regime and third military regime (1996–1999)

The governmental paralysis prompted the military to intervene; on 27 January 1996, Col. Ibrahim Baré Maïnassara led a coup that deposed President Ousmane and ended the Third Republic. Maïnassara headed a Conseil de Salut National (National Salvation Council), composed of military officials, which carried out a six-month transition period, during which a new constitution was drafted and adopted on 12 May 1996.
Presidential campaigns were organised in the months that followed. Maïnassara entered the campaign as an independent candidate and won the election on 8 July 1996; however, the election was viewed nationally and internationally as irregular, as the electoral commission had been replaced during the campaign. Meanwhile, Maïnassara instigated an IMF and World Bank-approved privatisation programme which enriched many of his supporters but was opposed by the trade unions. Following fraudulent local elections in 1999, the opposition ceased any cooperation with the Maïnassara regime. In unclear circumstances (possibly while attempting to flee the country), Maïnassara was assassinated at Niamey Airport on 9 April 1999. Maj. Daouda Malam Wanké then took over, establishing a transitional National Reconciliation Council to oversee the drafting of a constitution with a French-style semi-presidential system. This was adopted on 9 August 1999 and was followed by presidential and legislative elections in October and November of the same year. The elections were generally found to be free and fair by international observers. Wanké then withdrew from governmental affairs.

Fifth Republic (1999–2009)

After winning the election in November 1999, President Tandja Mamadou was sworn into office on 22 December 1999 as the first president of the Fifth Republic. Mamadou brought about many administrative and economic reforms that had been halted due to the military coups since the Third Republic, and helped peacefully resolve a decades-long boundary dispute with Benin. In August 2002, serious unrest within military camps occurred in Niamey, Diffa, and Nguigmi, but the government was able to restore order within several days. On 24 July 2004, the first municipal elections in the history of Niger were held to elect local representatives, previously appointed by the government.
These elections were followed by presidential elections, in which Mamadou was re-elected for a second term, thus becoming the first president of the republic to win consecutive elections without being deposed by a military coup. The legislative and executive configuration remained quite similar to that of the president's first term: Hama Amadou was reappointed as prime minister and Mahamane Ousmane, the head of the CDS party, was re-elected as president of the National Assembly (parliament) by his peers. By 2007, the relationship between President Tandja Mamadou and his prime minister had deteriorated, leading to the latter's replacement in June 2007 by Seyni Oumarou following a successful vote of no confidence in the Assembly. The political environment worsened in the following year as President Tandja Mamadou sought to extend his presidency by modifying the constitution, which limited presidential terms in Niger. Proponents of the extended presidency, rallied behind the 'Tazartche' (Hausa for 'overstay') movement, were countered by opponents ('anti-Tazartche') composed of opposition party militants and civil society activists. The situation in the north also deteriorated significantly in this period, resulting in the outbreak of a Second Tuareg Rebellion in 2007 led by the Mouvement des Nigériens pour la justice (MNJ). Despite a number of high-profile kidnappings, the rebellion had largely fizzled out inconclusively by 2009.
However, the poor security situation in the region is thought to have allowed elements of Al-Qaeda in the Islamic Maghreb (AQIM) to gain a foothold in the country.

Fourth military regime (2009–2010)

In 2009, President Tandja Mamadou decided to organize a constitutional referendum seeking to extend his presidency, a move opposed by other political parties and contrary to the ruling of the Constitutional Court, which had held that such a referendum would be unconstitutional. Mamadou then modified and adopted a new constitution by referendum, which was declared illegal by the Constitutional Court, prompting Mamadou to dissolve the Court and assume emergency powers. The opposition boycotted the referendum, and the new constitution was adopted with 92.5% of voters in favour on a 68% turnout, according to official results. The new constitution created a Sixth Republic with a presidential system, suspended the 1999 Constitution, and installed a three-year interim government with Tandja Mamadou as president. The events generated severe political and social unrest throughout the country. In a coup d'état in February 2010, a military junta led by Captain Salou Djibo was established in response to Tandja's attempted extension of his political term by modifying the constitution. The Supreme Council for the Restoration of Democracy, led by Djibo, carried out a one-year transition plan, drafted a new constitution and held elections in 2011 that were judged internationally as free and fair.

Seventh Republic (2010–present)

Following the adoption of a new constitution in 2010 and presidential elections a year later, Mahamadou Issoufou was elected as the first president of the Seventh Republic; he was re-elected in 2016. The constitution also restored the semi-presidential system which had been abolished a year earlier. An attempted coup against him in 2011 was thwarted and its ringleaders arrested.
Issoufou's time in office has been marked by numerous threats to the country's security, stemming from the fallout of the Libyan Civil War and the Northern Mali conflict, a rise in attacks by AQIM, the use of Niger as a transit country for migrants (often organised by criminal gangs), and the spillover of Nigeria's Boko Haram insurgency into south-eastern Niger. French and American forces are currently assisting Niger in countering these threats. On 27 December 2020, Nigeriens went to the polls after Issoufou announced he would step down, paving the way for Niger's first ever peaceful transition of power. However, no candidate won an absolute majority in the vote: Mohamed Bazoum came closest with 39.33%. As per the constitution, a run-off election was held on 20 February 2021, with Bazoum taking 55.75% of the vote and opposition candidate (and former president) Mahamane Ousmane taking 44.25%, according to the electoral commission. On 31 March 2021, Niger's security forces thwarted an attempted coup by a military unit in the capital, Niamey. Heavy gunfire was heard in the early hours near the country's presidential palace. The attack took place just two days before the newly elected president, Mohamed Bazoum, was due to be sworn into office. The Presidential Guard arrested several people during the incident. On 2 April 2021, Bazoum was sworn in as President of Niger, marking the country's first democratic transition of power since independence in 1960.

Geography, climate, and ecology

Niger is a landlocked nation in West Africa located along the border between the Sahara and Sub-Saharan regions. It borders Nigeria and Benin to the south, Burkina Faso and Mali to the west, Algeria and Libya to the north, and Chad to the east. Niger lies between latitudes 11° and 24°N, and longitudes 0° and 16°E. Niger's area is , of which is water. This makes it slightly less than twice the size of France, and the world's twenty-second largest country.
Niger borders seven countries and has a total perimeter of . The longest border is with Nigeria to the south (). This is followed by Chad to the east (), Algeria to the north-northwest (), and Mali (). Niger also has short borders in its far southwest with Burkina Faso () and Benin (), and to the north-northeast with Libya (). The lowest point is the Niger River, with an elevation of . The highest point is Mont Idoukal-n-Taghès in the Aïr Mountains, at .

Climate

Niger's climate is mainly very hot and dry, with much desert area; the dryness causes frequent fires in some regions of the country. In the extreme south there is a tropical climate on the edges of the Niger River basin. The terrain is predominantly desert plains and sand dunes, with flat to rolling savanna in the south and hills in the north.

Environment

The territory of Niger contains five terrestrial ecoregions: Sahelian Acacia savanna, West Sudanian savanna, Lake Chad flooded savanna, South Saharan steppe and woodlands, and West Saharan montane xeric woodlands. The north of Niger is covered by large deserts and semi-deserts. The typical mammal fauna consists of addax antelopes, scimitar-horned oryx, gazelles and, in the mountains, Barbary sheep. One of the largest reserves in the world, the Aïr and Ténéré National Nature Reserve, was founded in the northern part of Niger to protect these rare species. The southern parts of Niger are naturally dominated by savannah. The W National Park, situated in the border area adjoining Burkina Faso and Benin, belongs to one of the most important areas for wildlife in Western Africa, known as the WAP (W–Arli–Pendjari) Complex. It has the most important population of the rare West African lion and one of the last populations of the Northwest African cheetah. Other wildlife includes elephants, buffaloes, roan antelopes, kob antelopes and warthogs.
The West African giraffe is currently not found in the W National Park, but further north in Niger, where it has its last relict population. Environmental issues in Niger include destructive farming practices as a result of population pressure, illegal hunting, bush fires in some areas, and human encroachment upon the flood plains of the Niger River for paddy cultivation. Dams constructed on the Niger River in the neighboring countries of Mali and Guinea, and also within Niger itself, are also cited as a reason for the reduction of water flow in the Niger River, which has a direct effect upon the environment. A lack of adequate staff to guard wildlife in the parks and reserves is another factor cited for loss of wildlife. Farmer-managed natural regeneration has been practiced since 1983 to increase food and timber production and resilience to climate extremes.

Governance and politics

Niger's new constitution was approved on 31 October 2010. It restored the semi-presidential system of government of the 1999 constitution (Fifth Republic), in which the president of the republic, elected by universal suffrage for a five-year term, and a prime minister named by the president share executive power. As a reflection of Niger's increasing population, the unicameral National Assembly was expanded in 2004 to 113 deputies elected for a five-year term under a majority system of representation. Political parties must attain at least 5 percent of the vote in order to gain a seat in the legislature. The constitution also provides for the popular election of municipal and local officials, and the first-ever successful municipal elections took place on 24 July 2004. In June 2002, the National Assembly passed a series of decentralization bills. As a first step, administrative powers will be distributed among 265 communes (local councils); in later stages, regions and departments will be established as decentralized entities.
A new electoral code was adopted to reflect the decentralization context. The country is currently divided into 8 regions, which are subdivided into 36 districts (departments). The chief administrator (governor) in each department is appointed by the government and functions primarily as the local agent of the central authorities. On 26 May 2009, President Tandja dissolved parliament after the country's constitutional court ruled against plans to hold a referendum on whether to allow him a third term in office. According to the constitution, a new parliament was elected within three months. This began a political struggle between Tandja, trying to extend his term-limited authority beyond 2009 through the establishment of a Sixth Republic, and his opponents, who demanded that he step down at the end of his second term in December 2009. See 2009 Nigerien constitutional crisis. The military took over the country and President Tandja was put in prison, charged with corruption. The military kept their promise to return the country to democratic civilian rule. A constitutional referendum and national elections were held. A presidential election was held on 31 January 2011, but as no clear winner emerged, run-off elections were held on 12 March 2011. Mahamadou Issoufou of the Nigerien Party for Democracy and Socialism was elected president. A parliamentary election was held at the same time.

Foreign relations

Niger pursues a moderate foreign policy and maintains friendly relations with the West and the Islamic world as well as non-aligned countries. It belongs to the UN and its main specialized agencies and in 1980–81 served on the UN Security Council. Niger maintains a special relationship with former colonial power France and has close relations with its West African neighbors.
It is a charter member of the African Union and the West African Monetary Union and also belongs to the Niger Basin Authority and Lake Chad Basin Commission, the Economic Community of West African States, the Non-Aligned Movement, the Organisation of Islamic Cooperation and the Organization for the Harmonization of Business Law in Africa (OHADA). The westernmost regions of Niger are joined with contiguous regions of Mali and Burkina Faso under the Liptako-Gourma Authority. The border dispute with Benin, inherited from colonial times and concerning inter alia Lété Island in the Niger River, was resolved by the International Court of Justice in 2005 in Niger's favour.

Military

The Niger Armed Forces (Forces armées nigériennes) are the military and paramilitary forces of Niger, under the president as supreme commander. They consist of the Niger Army (Armée de Terre), the Niger Air Force (Armée de l'Air) and auxiliary paramilitary forces, such as the National Gendarmerie (Gendarmerie nationale) and the National Guard (Garde Nationale). Both paramilitary forces are trained in military fashion and have some military responsibilities in wartime; in peacetime their duties are mostly policing duties. The armed forces are composed of approximately 12,900 personnel, including 3,700 gendarmes, 3,200 national guards, 300 air force personnel, and 6,000 army personnel. The armed forces of Niger have been involved in several military coups over the years, the most recent in 2010. Niger's armed forces have a long history of military cooperation with France and the United States. Niamey is home to a U.S. drone base.

Judicial system

The current Judiciary of Niger was established with the creation of the Fifth Republic in 1999. The constitution of December 1992 was revised by national referendum on 12 May 1996 and, again by referendum, revised to the current version on 18 July 1999.
It is based on the Napoleonic Code and the inquisitorial system established in Niger during French colonial rule, and on the 1960 Constitution of Niger. The Court of Appeals reviews questions of fact and law, while the Supreme Court reviews application of the law and constitutional questions. The High Court of Justice (HCJ) deals with cases involving senior government officials. The justice system also includes civil criminal courts, customary courts, traditional mediation, and a military court. The military court provides the same rights as civil criminal courts; however, customary courts do not. The military court cannot try civilians.

Law enforcement

Law enforcement in Niger is the responsibility of the Ministry of Defense, through the National Gendarmerie, and the Ministry of the Interior, through the National Police and the National Guard. The National Police is primarily responsible for law enforcement in urban areas. Outside big cities and in rural areas, this responsibility falls on the National Gendarmerie and the National Guard.

Government finance

Government finance is derived from export revenues (mining, oil and agricultural exports) as well as various forms of taxes collected by the government. In the past, foreign aid has contributed a large percentage of the budget. In 2013, Niger's government adopted a zero-deficit budget of 1.279 trillion CFA francs ($2.53 billion), which it claimed would balance revenues and expenditures through an 11% reduction in the budget from the previous year. The 2014 budget of 1.867 trillion CFA francs was distributed as follows: public debt (76,703,692,000 CFA), personnel expenditures (210,979,633,960 CFA), operating expenditures (128,988,777,711 CFA), subsidies and transfers (308,379,641,366 CFA) and investment (1,142,513,658,712 CFA).
Foreign aid

The importance of external support for Niger's development is demonstrated by the fact that about 45% of the government's FY 2002 budget, including 80% of its capital budget, derived from donor resources. The most important donors in Niger are France, the European Union, the World Bank, the International Monetary Fund, and various United Nations agencies (UNDP, UNICEF, FAO, World Food Program, and United Nations Population Fund). Other principal donors include the United States, Belgium, Germany, Switzerland, Canada, and Saudi Arabia. While USAID does not have an office in Niger, the United States is a major donor, contributing nearly $10 million each year to Niger's development. The U.S. is also a major partner in policy coordination in such areas as food security and HIV/AIDS.

Administrative divisions

Niger is divided into 7 regions and one capital district. These regions are subdivided into 36 departments. The 36 departments are currently broken down into communes of varying types. There were 265 communes, including communes urbaines (Urban Communes, as subdivisions of major cities), communes rurales (Rural Communes) in sparsely populated areas, and postes administratifs (Administrative Posts) for largely uninhabited desert areas or military zones. Rural Communes may contain official villages and settlements, while Urban Communes are divided into quarters. Niger's subdivisions were renamed in 2002, in the implementation of a decentralisation project first begun in 1998. Previously, Niger was divided into 7 departments, 36 arrondissements, and communes. These subdivisions were administered by officials appointed by the national government. These offices will be replaced in the future by democratically elected councils at each level.
The pre-2002 departments (renamed as regions) and the capital district are:

Agadez Region
Diffa Region
Dosso Region
Maradi Region
Tahoua Region
Tillabéri Region
Zinder Region
Niamey

Economy

The economy of Niger centers on subsistence crops, livestock, and some of the world's largest uranium deposits. Drought cycles, desertification, a 2.9% population growth rate, and the drop in world demand for uranium have undercut the economy. Niger shares a common currency, the CFA franc, and a common central bank, the Central Bank of West African States (BCEAO), with the other members of the West African Monetary Union.
maintenance of traditional social structures and the retention of close economic ties with France. He was re-elected unopposed in 1965 and 1970. Diori gained worldwide respect for his role as a spokesman for African affairs and as a popular arbitrator in conflicts involving other African nations. Domestically, however, his administration was rife with corruption, and the government was unable to implement much-needed reforms or to alleviate the widespread famine brought on by the Sahelian drought of the early 1970s. Increasingly criticized at home for his negligence in domestic matters, Diori put down a coup in 1963 and narrowly escaped assassination in 1965. Faced with an attempted military coup and attacks by members of Sawaba, he used French advisers and troops to counter threats to his rule, despite student and union protests against what they perceived French neocolonialism. However, his relationship with France suffered when his government voiced dissatisfaction with the level of investment in uranium production when French President Georges Pompidou visited Niger in 1972. The PPN functioned as a platform for a handful of Politburo leaders grouped around Diori and his advisors Boubou Hama and Diamballa Maiga, who were largely unchanged from their first election in 1956. By 1974 the party had not held a congress since 1959 (one was scheduled for late 1974 during the famine induced political crisis, but never held). The PPN election lists were made up of traditional rulers from the main ethnic regions who, upon election to the Assembly, were given only ceremonial power. Ethnic tensions, too, mounted during Diori's regime. The Politburo and successive cabinets were made up almost exclusively of Djerma, Songhai and Maouri ethnic groups from the west of the country, the same ethnic base the French had relied on during colonial rule. 
No Politburo ever contained a member of Hausa or Fula groups, even though the Hausa were the plurality of the population, forming over 40% of Nigeriens. Widespread civil disorder followed allegations that some government ministers were misappropriating stocks of food aid and accused Diori of consolidating power. Diori limited cabinet appointments to fellow Djerma, family members, and close friends. In addition, he acquired new powers by declaring himself the minister of foreign and defense affairs. 1974 to 1990 On 15 April 1974, Lieutenant colonel Seyni Kountché led a military coup that ended Diori's rule. Diori was imprisoned until 1980 and remained under house arrest. The government that followed, while plagued by coup attempts of its own, survived until 1993. While a period of relative prosperity, the military government of the period allowed little free expression and engaged in arbitrary imprisonment and killing. The first presidential elections took place in 1993 (33 years after independence), and the first municipal elections only took place in 2007. Upon Kountché's death in 1987, he was succeeded by his Chief of Staff and cousin, Col. Ali Saibou. Saibou liberalized some of Niger's laws and policies, and promulgated a new constitution. He released political prisoners, including Diori and his old political nemesis Djibo Bakary. However, President Saibou's efforts to control political reforms failed in the face of union and student demands to institute a multi-party democratic system. The Saibou regime acquiesced to these demands by the end of 1990. New political parties and civic associations sprang up, and a National Conference was convened in July 1991 to prepare the way for the adoption of a new constitution and the holding of free and fair elections. The debate was often contentious and accusatory, but under the leadership of Prof. André Salifou, the conference developed consensus on the modalities of a transitional government. 
1990s A transitional government was installed in November 1991 to manage the affairs of state until the institutions of the Third Republic were put in place in April 1993. While the economy deteriorated over the course of the transition, certain accomplishments stand out, including the successful conduct of a constitutional referendum; the adoption of key legislation such as the electoral and rural codes; and the holding of several free, fair, and nonviolent nationwide elections. Freedom of the press flourished with the appearance of several new independent newspapers. In 1993, Mahamane Ousmane, the Democratic and Social Convention (CDS) party candidate, won the presidential election with the support of a coalition of parties. The agreement between the parties fell apart in 1994 leading to governmental paralysis as the CDS on its own no longer had a majority in the assembly. Ousmane dissolved the legislature and called new legislative elections, but the National Movement for the Development of Society (MNSD) party won the largest group of seats, so Ousmane was compelled to appoint Hama Amadou of the MNSD as prime minister. The prime minister then prepared for a surprise attack. Since 1990 Tuareg and Toubou groups that had been leading the Tuareg Rebellion claiming they lacked attention and resources from the central government. As the culmination of an initiative started in 1991, the government signed peace accords in April 1995 with these groups. The government agreed to absorb some former rebels in the military and, with French assistance, help others return to a productive civilian life. The paralysis of government between the President and the Prime Minister who no longer agreed gave Col. Ibrahim Baré Maïnassara a rationale to overthrow the Third Republic and depose the first democratically elected president of Niger, on 27 January 1996. 
While leading a military authority that ran the government (Conseil de Salut National) during a six-month transition period, Baré enlisted specialists to draft a new constitution for a Fourth Republic announced in May 1996. Baré organized a presidential election in June 1996. He ran against four other candidates, including Ousmane. Before voting had finished, Baré dissolved the national electoral committee and appointed another, which announced him the winner with over 50% of the votes cast. | the 10th century due to its dangers. Imperial Niger Niger was an important economic crossroads, and the empires of Songhai, Mali, Gao, and Kanem-Bornu, as well as a number of Hausa states, claimed control over portions of the area. During recent centuries, the nomadic Tuareg formed large confederations, pushed southward, and, siding with various Hausa states, clashed with the Fulani Empire of Sokoto, which had gained control of much of the Hausa territory in the late 18th century. The area eventually became known as the Bornu Empire, which ended in 1893. Colonization In the 19th century, contact with Europe began when the first European explorers—notably Mungo Park (British) and Heinrich Barth (German)-explored the area searching for the mouth of the Niger River. Although French efforts at pacification began before 1900, dissident ethnic groups, especially the desert Tuareg, were not subdued until 1922, when Niger became a French colony. Niger's colonial history and development parallel that of other French West African territories. France administered her West African colonies through a governor general at Dakar, Senegal, and governors in the individual territories, including Niger. In addition to conferring a limited form of French citizenship on the inhabitants of the territories, the 1946 French constitution provided for decentralization of power and limited participation in political life for local advisory assemblies. 
Towards independence A further revision in the organization of overseas territories occurred with the passage of the Overseas Reform Act (Loi Cadre) of 23 July 1956, followed by re-organizational measures enacted by the French Parliament early in 1957. In addition to removing voting inequalities, these laws provided for creation of governmental organs, assuring individual territories a measure of self-government over internal matters such as education, health, and infrastructure. After the establishment of the Fifth French Republic on 4 October 1958, the territories of French West Africa and French Equatorial Africa were given the right to hold a referendum on their membership in the French Community, a modified form of the French Union which allowed some limited self-government and was viewed as a path to eventual independence. The 4 December elections (on whether to remain in the French Community, followed shortly by those for the Nigerien territorial assembly) were contested by the two political blocks of the Territorial Assembly. The Nigerien Progressive Party (PPN), originally a regional branch of the African Democratic Rally (RDA), led the Union for the Franco-African Community (UCFA) and was headed by PPN leader and deputy-speaker of the Assembly Hamani Diori. The other block was led by the then majority leader of the Assembly, Djibo Bakary. His Movement Socialist Africain (known by the name Sawaba – independence in the Hausa language) called for a "no" vote: one of only two major formations in French West Africa to do so. While there have always been questions about French influence in the voting The results of both elections were confirmed on the 16th. The PPN led UCFA (yes 358,000) defeated Sawaba (no 98,000), winning 54 seats to 4 in the 60 seat assembly. On the 18th Niger declared itself a republic within the French Community and the Territorial Assembly became the Constituent Assembly. 
This date (18 December 1958) is celebrated as Republic Day, the national holiday of Niger, and considered the date of the founding of the nation. In March 1959 this became the Legislative Assembly. In 1958 Diori became president of the provisional government, and then became Prime Minister of Niger in 1959. Having organised a powerful coalition of Hausa, Fula, and Djerma leaders, especially made up of chiefs and traditional leaders, in support of Niger's "Yes" vote in the 1959 referendum, Diori gained French favor. During the 1959–1960 period, the French government banned all political parties except the PPN, effectively making Niger a one-party state. The Sawaba leaders fled into exile, and the member parties of the UCFA were folded into the PPN. Independence On 11 July 1960 France agreed to Niger becoming fully independent. The French Fifth Republic passed a revision of the French Community allowing membership of independent states. On 28 July the Nigerien Legislative Assembly became the Nigerien National Assembly. Independence was declared on 3 August 1960 under the leadership of Prime Minister Diori. Subsequently, in November 1960 Diori was elected to the new position of President of Niger by the National Assembly. During his presidency, Diori's government favored the maintenance of traditional social structures and the retention of close economic ties with France. He was re-elected unopposed in 1965 and 1970. Diori gained worldwide respect for his role as a spokesman for African affairs and as a popular arbitrator in conflicts involving other African nations. Domestically, however, his administration was rife with corruption, and the government was unable to implement much-needed reforms or to alleviate the widespread famine brought on by the Sahelian drought of the early 1970s. Increasingly criticized at home for his negligence in domestic matters, Diori put down a coup in 1963 and narrowly escaped assassination in 1965. 
Faced with an attempted military coup and attacks by members of Sawaba, he used French advisers and troops to counter threats to his rule, despite student and union protests against what they perceived as French neocolonialism. However, his relationship with France suffered when his government voiced dissatisfaction with the level of investment in uranium production during French President Georges Pompidou's visit to Niger in 1972. The PPN functioned as a platform for a handful of Politburo leaders grouped around Diori and his advisors Boubou Hama and Diamballa Maiga, who were largely unchanged from their first election in 1956. By 1974 the party had not held a congress since 1959 (one was scheduled for late 1974 during the famine-induced political crisis, but never held). The PPN election lists were made up of traditional rulers from the main ethnic regions who, upon election to the Assembly, were given only ceremonial power. Ethnic tensions, too, mounted during Diori's regime. The Politburo and successive cabinets were made up almost exclusively of Djerma, Songhai and Maouri ethnic groups from the west of the country, the same ethnic base the French had relied on during colonial rule. No Politburo ever contained a member of the Hausa or Fula groups, even though the Hausa were the plurality of the population, forming over 40% of Nigeriens. Widespread civil disorder followed allegations that some government ministers were misappropriating stocks of food aid and accusations that Diori was consolidating power. Diori limited cabinet appointments to fellow Djerma, family members, and close friends. In addition, he acquired new powers by declaring himself the minister of foreign and defense affairs. 1974 to 1990 On 15 April 1974, Lieutenant Colonel Seyni Kountché led a military coup that ended Diori's rule. Diori was imprisoned until 1980 and thereafter remained under house arrest. The government that followed, while plagued by coup attempts of its own, survived until 1993.
Although the period was one of relative prosperity, the military government allowed little free expression and engaged in arbitrary imprisonment and killing. The first presidential elections took place in 1993 (33 years after independence), and the first municipal elections only took place in 2007. Upon Kountché's death in 1987, he was succeeded by his Chief of Staff and cousin, Col. Ali Saibou. Saibou liberalized some of Niger's laws and policies, and promulgated a new constitution. He released political prisoners, including Diori and his old political nemesis Djibo Bakary. However, President Saibou's efforts to control political reforms failed in the face of union and student demands to institute a multi-party democratic system. The Saibou regime acquiesced to these demands by the end of 1990. New political parties and civic associations sprang up, and a National Conference was convened in July 1991 to prepare the way for the adoption of a new constitution and the holding of free and fair elections. The debate was often contentious and accusatory, but under the leadership of Prof. André Salifou, the conference developed consensus on the modalities of a transitional government. 1990s A transitional government was installed in November 1991 to manage the affairs of state until the institutions of the Third Republic were put in place in April 1993. While the economy deteriorated over the course of the transition, certain accomplishments stand out, including the successful conduct of a constitutional referendum; the adoption of key legislation such as the electoral and rural codes; and the holding of several free, fair, and nonviolent nationwide elections. Freedom of the press flourished with the appearance of several new independent newspapers. In 1993, Mahamane Ousmane, the Democratic and Social Convention (CDS) party candidate, won the presidential election with the support of a coalition of parties.
The agreement between the parties fell apart in 1994, leading to governmental paralysis as the CDS on its own no longer had a majority in the assembly. Ousmane dissolved the legislature and called new legislative elections, but the National Movement for the Development of Society (MNSD) party won the largest group of seats, so Ousmane was compelled to appoint Hama Amadou of the MNSD as prime minister. Since 1990, Tuareg and Toubou groups had been waging the Tuareg Rebellion, claiming they lacked attention and resources from the central government. As the culmination of an initiative started in 1991, the government signed peace accords in April 1995 with these groups. The government agreed to absorb some former rebels into the military and, with French assistance, help others return to a productive civilian life. The paralysis of government between the President and the Prime Minister, who no longer cooperated, culminated in January 1996, when Col. Ibrahim Baré Maïnassara seized power in a military coup.
Fresh elections for a return to democratic rule followed, and Mamadou Tandja assumed power in December 1999. Tandja, who was re-elected in 2004, sought a constitutional amendment to extend his tenure as president. However, in February 2010 he was removed from the presidency in a military coup, and the constitution was annulled. Elections were held soon after, and Mahamadou Issoufou was elected president, taking office in April 2011. Niger's problems with rebel groups continued during 2007 and 2008, though the rebellion was eventually brought under control. However, security problems involving its neighbors, such as Libya, Nigeria and Mali, have remained a cause for concern. Geography Niger, with a land area of 1.267 million km2, is a landlocked country bounded by seven countries along a land boundary of 5,834 km: Algeria (951 km), Benin (277 km), Burkina Faso (622 km), Chad (1,196 km), Libya (342 km), Mali (838 km), and Nigeria (1,608 km). Regions Niger is divided into 7 Regions (French: régions; singular: région). Each region's capital is the same as its name. The national capital, Niamey, comprises a capital district. Departments The Regions of Niger are subdivided into 63 Departments. Communes The 63 Departments are broken down into Communes. As of 2006 there were 265 communes, including communes urbaines (Urban Communes: centred in or as subdivisions of cities of over 10,000), communes rurales (Rural Communes), centred in cities of under 10,000 and/or sparsely populated areas, and a variety of traditional (clan or tribal) bodies amongst semi-nomadic populations. Cities Roadways Physical geography Agricultural geography Some of the land in Niger is used as arable land (660 km2 of land in Niger is irrigated) and as pasture. There are some forests and woodland. The table below describes land use in Niger, as of 2011. Climate Niger's climate is largely hot and dry, with most of the country in a desert region.
The terrain is predominantly desert plains and sand dunes. There are also large plains in the south and hills in the north. In the extreme south, there is a tropical climate near the edges of the Niger River Basin. Lake Chad at the southeast corner of the country is shared between Niger, Nigeria, Chad, and Cameroon. Current issues Current environmental issues in Niger include overgrazing, soil erosion, deforestation, desertification, recurring droughts, and endangered wildlife populations (such as the African elephant, Northwest African cheetah, West African giraffe, and Addax), which are threatened because of poaching and habitat destruction.
Natural hazards Recurring droughts are a serious challenge for Niger. The 2012 Sahel drought, along with failed crops, insect plagues, high food prices and conflicts, is currently affecting Niger, causing a hunger crisis. Many families in Niger, still recovering from the 2010 Sahel famine, are being affected by the 2012 Sahel drought. The 2005–06 Niger food crisis created a severe but localized food security crisis in the
due to Niger being a former colony of France. Niger's high infant mortality rate is comparable to levels recorded in neighboring countries. However, the child mortality rate (deaths among children between the ages of 1 and 4) is exceptionally high (274 per 1,000) due to generally poor health conditions and inadequate nutrition for most of the country's children. Niger's very high total fertility rate (6.89 children born per woman, the highest in the world) means that nearly half (49%) of the Nigerien population is under age 15. School attendance is low (34%), including 38% of males and 27% of females. Additional education occurs through Koranic schools. Population Source: Institut National de la Statistique - Niger Census results UN estimates According to the total population was in , compared to only 2,462,000 in 1950. The proportion of children and teenagers below the age of 15 in 2010 was 49%; 48.8% were between 15 and 65 years of age, while only 2.2% were 65 years or older. Life expectancy Vital statistics Registration of vital events in Niger is incomplete. The Population Division of the United Nations prepared the following estimates. Fertility and births Total fertility rate (TFR; Wanted Fertility Rate) and crude birth rate (CBR): Fertility data as of 2012 (DHS Program): Ethnic groups Core health indicators Other demographic statistics Demographic statistics according to the World Population Review in 2019. One birth every 29 seconds One death every 3 minutes One net migrant every 90 minutes Net gain of one person every 36 seconds The following demographic statistics are from the CIA World Factbook unless otherwise indicated. Population 19,866,231 (July 2018 est.)
Age structure 0-14 years: 48.68% (male 4,878,031 /female 4,793,021) 15-24 years: 19.36% (male 1,899,879 /female 1,945,806) 25-54 years: 26.02% (male 2,581,597 /female 2,587,913) 55-64 years: 3.3% (male 340,032 /female 315,142) 65 years and over: 2.64% (male 268,072 /female 256,738) (2018 est.) Birth rate 43.6 births/1,000 population (2018 est.) Country comparison to the world: 2nd Death rate 11.5 deaths/1,000 population (2018 est.) Total fertility rate 6.35 children born/woman (2018 est.) Country comparison to the world: 1st Median age total: 15.5 years. Country comparison to the world: 228th male: 15.4 years female: 15.7 years (2018 est.) Population growth rate 3.16% (2018 est.) Country comparison to the world: 7th Mother's mean age at first birth 18.1 years (2012 est.) note: median age at first birth among women 25-29 Contraceptive prevalence rate 18.9% (2017) Net migration rate -0.5 migrant(s)/1,000 population (2017 est.) Country comparison to the world: 125th Dependency ratios total dependency ratio: 111.6 (2015 est.) youth dependency ratio: 106.2 (2015 est.) elderly dependency ratio: 5.4 (2015 est.) potential support ratio: 18.6 (2015 est.) Urbanization urban population: 16.4% of total population (2018) rate of urbanization: 4.27% annual rate of change (2015-20 est.)
Sex ratio at birth: 1.03 male(s)/female younger than 15 years: 1.02 male(s)/female 15–64 years: 0.99 male(s)/female 65 years and over: 0.8 male(s)/female total population: 1 male(s)/female (2010 est.) Life expectancy at birth total population: 56.3 years (2018 est.) male: 55 years (2018 est.) female: 57.7 years (2018 est.) total population: 52.6 years male: 51.39 years female: 53.85 years (2010 est.) Nationality noun: Nigerien(s) adjective: Nigerien Ethnic Groups Hausa 53.1% Zarma/Songhai 21.2% Tuareg 11% Fulani 6.5% Kanuri 5.9% Gurma 0.8% Arab 0.4% Tubu 0.4% Other/Unavailable 0.9% (2006 est.) Religions Muslim 99.3%, Christian 0.3%, animist 0.2%, none 0.1% (2012 est.) Languages French (official) Hausa Zarma (Djerma) Literacy definition: age 15 and over can read and write (2015 est.) total population: 19.1% (2015 est.) male: 27.3% (2015 est.) female: 11% (2015 est.) Total population: 28.7% (2004 est.; source: UNDP 2006; NB: this figure is given without reference to which languages are considered) Male: 42.9% Female: 15.1% School life expectancy (primary to tertiary education) total: 6
in November 2002, serving in that position until December 2004. 2004 elections While Tandja easily retained the presidency against a second-round challenge by Mahamadou Issoufou, the 2004 National Assembly elections were closer. The PNDS formed a coalition to contest the expanded 113 seats of the National Assembly, which also included the UNI (2 seats), the PPN (2), and the PNA-Al'ouma (4). With the PNDS' 17 seats this coalition took 25 seats. The MNSD remained the largest party at 47 seats, but relied again on CDS-Rahama's 22 seats to govern. Minor portfolios in the Council of Ministers were given to two smaller parties as well, the RDP-Jama'a (6 seats) and ANDP-Zaman Lahiya (5 seats). RSD-Gaskiya (7 seats) and PSDN-Alheri (1 seat) remained aloof from both blocs. 2007 PM crisis In December 2004 Hama Amadou was again chosen as Prime Minister. Mahamane Ousmane, the head of the CDS, was re-elected President of the National Assembly. The new second term government of the Fifth Republic took office on 30 December 2004. In June 2007, a no-confidence vote against the government led to the fall of Prime Minister Hama Amadou and his ministers. Amadou was replaced by Seyni Oumarou, also of the president's MNSD-Nassara party, leading to infighting within a portion of the party still loyal to Amadou. Broad changes were made to the Council of Ministers of Niger, with MNSD-Nassara continuing to take the majority of portfolios, but with the CDS, RDP-Jama'a, and ANDP-Zaman Lahiya retaining ministerial appointments. Tazarce In the run-up to the 2009 elections (Presidential, Assembly, and Municipal), a movement to draft President Tandja for a third term appeared. Led by public figures of the MNSD outside government, the group took the name of Tandja's 2004 re-election slogan, Tazarce: a Hausa word meaning "Continuity". Despite several well-funded and well-attended public rallies in late 2008, the President remained silent on the calls for him to remain.
The 1999 constitution made serving more than two terms impossible (article 36) and made any revision of that article illegal (article 136). Prime Minister Seyni Oumarou reiterated on 22 January that all scheduled elections would go ahead before the end of 2009. In March, during his meetings with French President Sarkozy, Tandja explicitly stated that he would not seek a third term. Then, in early May 2009, when questioned by the press on his visit to Agadez to begin peace talks with Tuareg rebels, Tandja announced that "the people have demanded I remain." His spokesman then outlined a plan in which a referendum could be held in mid-2009, not to amend the 1999 constitution, but to scrap it and begin work on a constitution of the Sixth Republic of Niger, which would contain no term limits for the President, and create a fully Presidential republic. On 15 May 2009, in response to their parties' opposition to a proposed referendum to allow the President to seek a third term, the three members of RDP-Jama'a and ANDP-Zaman Lahiya were replaced with ministers drawn from the MNSD-Nassara. With the continued support of the CDS, the MNSD maintained a working majority of 67 seats in the 113-seat National Assembly. According to the 1999 Constitution of Niger, the President may call a referendum on any matter (except for a revision of those elements of the Constitution outlined in Article 136, including the presidential term limits). The Constitutional Court of Niger and the National Assembly of Niger must advise the president, but there is no provision that the president must heed their advice.
On 25 May 2009, the Constitutional Court, made up of appointed judges, released a ruling that any referendum to create a new constitution would be unconstitutional, and further would be a violation of the oath the president had taken on the Koran (a serious matter in this overwhelmingly Muslim country). The week prior, two major parties had declared their opposition to the referendum proposal as well. On 13 May, the ANDP-Zaman Lahiya, led by former MNSD number two Djermokoye, declared its opposition to any change in the constitution. On 15 May the CDS-Rahama, the party without which the MNSD could not have formed governments in 1999, 2004, and 2007, came out against the referendum, calling the constitution unalterable. Neither party moved into the opposition, and both Ousmane and Djermokoye said they were willing to negotiate with the president. On 26 May, within hours of the Constitutional Court's statement, official media read out a statement that President Tandja had dissolved the National Assembly. Under the 1999 Constitution he was allowed to do this once every two years, but had to call parliamentary elections within three months. This would mean the government of Niger would carry out scheduled parliamentary elections in September, two months early, and a referendum on a new constitution before presidential elections, which could take place no later than December, assuming the 1999 constitution remained in effect. 2010 Coup On 19 February 2010, a group calling itself the Supreme Council for the Restoration of Democracy (CSRD) stormed the presidential palace during a meeting and took President Mamadou Tandja hostage. Colonel Goukoye Abdul Karimou, spokesman for the CSRD, announced on state television that the country's constitution had been suspended and all state institutions dissolved. The president was believed to be held in a garrison in the capital, with his resignation being sought.
Political parties Constitution The constitution of December 1992 was revised by national referendum on 12 May 1996 and again, by referendum, to the current version on 18 July 1999. It restored the semi-presidential system of government of the December 1992 constitution (Third Republic) in which the president of the republic, elected by universal suffrage for a five-year term, and a prime minister named by the president share executive power. As a reflection of Niger's increasing population, the unicameral National Assembly was expanded in 2004 to 113 deputies elected for a 5-year term under a majority system of representation. Political parties must attain at least 5% of the vote in order to gain a seat in the legislature. Executive branch President: Mohamed Bazoum (Party for Democracy and Socialism), since 2 April 2021. Prime Minister: Ouhoumoudou Mahamadou (Party for Democracy and Socialism), since 3 April 2021.
est.) Household income or consumption by percentage share: lowest 10%: 3% highest 10%: 29.3% (1992) Inflation rate (consumer prices): 2.4% (2017 est.) Labour force: 6.5 million (2017 est.) Labour force – by occupation: agriculture: 79.2%, industry: 3.3%, services: 17.5% (2012 est.) Unemployment rate: 0.3% (2017 est.) Budget: revenues: $1.757 billion (2017 est.) expenditures: $2.171 billion (2017 est.) Industries: uranium mining, cement, brick, textiles, food processing, chemicals, slaughterhouses Industrial production growth rate: 6% (2017 est.) Electrification: total population: 15% (2013) urban areas: 62% (2013) rural areas: 4% (2013) Electricity – production: 494.7 million kWh (2016 est.) Electricity – production by source: fossil fuel: 95% renewable: 5% nuclear: 0% other: 0% (2017) Electricity – consumption: 1.065 billion kWh (2016 est.) Electricity – exports: 0 kWh (2016 est.) Electricity – imports: 779 million kWh (2016 est.) Agriculture – products: cowpeas, cotton, peanuts, pearl millet, sorghum, cassava (tapioca), rice; cattle, sheep, goats, camels, donkeys, horses, poultry Exports: $4.143 billion (2017 est.) Exports – commodities: uranium ore, livestock, cowpeas, onions Exports – partners: France 30.2%, Thailand 18.3%, Malaysia 9.9%, Nigeria 8.3%, Mali 5%, Switzerland 4.9% (2017) Imports: $1.829 billion (2017 est.) Imports – commodities: foodstuffs, machinery, vehicles and parts, petroleum, cereals Imports – partners: France 28.8%, China 14.4%, Malaysia 5.7%, Nigeria 5.4%, Thailand 5.3%, US 5.1%, India 4.9% (2017) Debt – external: $3.728 billion (31 December 2017 est.)
Economic aid – recipient: $222 million (1995) Currency: 1 Communauté Financière Africaine franc (CFAF) = 100 centimes Exchange rates: Communauté Financière Africaine francs (CFAF) per US$1 – 670 (January 2000), 560.01 (January 1999), 589.95 (1998), 583.67 (1997), 511.55 (1996), 499.15 (1995) note: since 1 January 1999, the CFAF is pegged to the euro at a rate of 655.957 CFA francs per euro Fiscal year: calendar year Economic sectors The economy of Niger centers on subsistence crops, livestock, and some of the world's largest uranium deposits. Drought cycles, desertification, a 2.9% population growth rate, and the drop in world demand for uranium have undercut the economy. Niger shares a common currency, the CFA franc, and a common central bank, the Central Bank of West African States (BCEAO), with seven other members of the West African Monetary Union. Niger is also a member of the Organization for the Harmonization of Business Law in Africa (OHADA). In December 2000, Niger qualified for enhanced debt relief under the International Monetary Fund program for Heavily Indebted Poor Countries (HIPC) and concluded an agreement with the Fund for Poverty Reduction and Growth Facility (PRGF). Debt relief provided under the enhanced HIPC initiative significantly reduces Niger's annual debt service obligations, freeing funds for expenditures on basic health care, primary education, HIV/AIDS prevention, rural infrastructure, and other programs geared at poverty reduction. In December 2005, it was announced that Niger had received 100% multilateral debt relief from the IMF, which translates into the forgiveness of approximately US$86 million in debts to the IMF, excluding the remaining assistance under HIPC. Nearly half of the government's budget is derived from foreign donor resources. Future growth may be sustained by exploitation of oil, gold, coal, and other mineral resources. Uranium prices have recovered somewhat in the last few years. 
A drought and locust infestation in 2005 led to food shortages for as many as 2.5 million Nigeriens. Agriculture The agricultural economy is based largely upon internal markets, subsistence agriculture, and the export of raw commodities: foodstuffs and cattle to neighbors. Foreign exchange earnings from livestock, although difficult to quantify, are considered the second source of export revenue behind mining and oil exports. Actual exports far exceed official statistics, which often fail to detect large herds of animals informally crossing into Nigeria. Some hides and skins are exported, and some are transformed into handicrafts. Niger's agricultural and livestock sectors are the mainstay of all but 18% of the population. 14% of Niger's GDP is generated by livestock production (camels, goats, sheep and cattle), said to support 29% of the population. Some 53% of the population is actively involved in crop production. The 15% of Niger's land that is arable is found mainly along its southern border with Nigeria. In these areas, pearl millet, sorghum, and cassava are the principal rain-fed subsistence crops. Irrigated rice for internal consumption is grown in parts of the Niger River valley in the west. While expensive, it has, since the devaluation of the CFA franc, sold for below the price of imported rice, encouraging additional production. Cowpeas and onions are grown for commercial export, as are small quantities of garlic, peppers, potatoes, and wheat. Oasis farming in small patches of the north of the country produces onions, dates, and some market vegetables for export. But for the most part, rural residents engaged in crop tending are clustered in the south centre and south west of the nation, in those areas (the Sahel) which can expect to receive between of rainfall annually. A small area in the southern tip of the nation, surrounding Gaya, can expect to receive or rainfall.
Northern areas which support crops, such as the southern portions of the Aïr Massif and the Kaouar oasis, rely upon oases and a slight increase in rainfall due to mountain effects. Large portions of the northwest and far east of the nation, while within the Sahara desert, see just enough seasonal rainfall to support semi-nomadic animal husbandry. The populations of these areas, mostly Tuareg, Wodaabe – Fula, and Toubou, travel south (a process called transhumance) to pasture and sell animals in the dry season, and north into the Sahara in the brief rainy season. Rainfall varies, and when it is insufficient, Niger has difficulty feeding its population and must rely on grain purchases and food aid to meet food requirements. Rains, as in much of the Sahel, have been marked by annual variability. This has been especially true in the 20th century, with the most severe drought on record beginning in the late 1960s and lasting, with one break, well into the 1980s. The long-term effect of this, especially on pastoralist populations, remains in the 21st century, with those communities which rely upon cattle, sheep, and camel husbandry losing entire herds more than once during this period. Recent rains remain variable. For instance, the rains in 2000 were not good, while those in 2001 were plentiful and well distributed. Soils that have become degraded, for example by intensive cereal production, cover 50 per cent of Niger's land. Laterite soils have a high clay content, which means they have higher cation exchange capacity and water-holding capacity than sandy soils. If laterite soils become degraded, a hard crust can form on the surface, which hinders water infiltration and the emergence of seedlings. It is possible to rehabilitate such soils, using a system called the Bioreclamation of Degraded Lands.
This involves using indigenous water-harvesting methods (such as planting pits and trenches), applying animal and plant residues, and planting high-value fruit trees and indigenous vegetable crops that are tolerant of drought conditions. The International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) has employed this system to rehabilitate degraded laterite soils in Niger and increase smallholder farmers' incomes. Trials have demonstrated that a plot can yield an income of around US$100, which is what men traditionally earn from millet production per hectare (10,000 m²). As women are often given degraded soils, using this practice has helped to improve livelihoods for women in Niger. The Kandadji Dam on the Niger River, whose construction started in August 2008, is expected to improve agricultural production in the Tillaberi Department by providing water for the irrigation of 6,000 hectares initially and of 45,000 hectares by 2034. Drought and food crisis As one of the Sahelian nations in West Africa, Niger has faced several droughts which have led to food shortages and, in some cases, famines since its independence in 1960. This includes a series of droughts in the 1970s and 1980s and more recently in 2005–2006 and again in 2010. The existence of widespread famine in 2005–2006 was debated by the government of Niger as well as by some local NGOs. Mining The Niger mining industry is the main source of national exports, of which uranium is the largest. Niger has been a uranium exporter since the 1960s and had substantial export earnings and rapid economic growth during the 1960s and 1970s. The persistent uranium price slump has brought lower revenues for Niger's uranium sector, although it still provides 72% of national export proceeds. When the uranium-led boom ended in the early 1980s the economy stagnated, and new investment since then has been limited.
Niger's two uranium mines (SOMAIR's open pit mine and COMINAK's underground mine) are owned by a French-led consortium and operated by the French company Orano. More recently, many licences have been sold to other companies from countries such as India, China, Canada and Australia in order to exploit new deposits. In 2013, the government of Niger sought to increase its uranium revenue by subjecting the two mining companies to the 2006 Mining Law. The government argued that the application of the new law would balance an otherwise unfavorable partnership between the government and Areva. The company resisted the application of the new law, which it feared would jeopardize the financial health of the companies, citing declining uranium prices and unfavorable market conditions. In 2014, following a nearly year-long negotiation with the government of Niger, Areva agreed to the application of the 2006 Mining Law, which would increase the government's uranium revenues from 5% to 12%. In addition to uranium, exploitable deposits of gold are known to exist in Niger in the region between the Niger River and the border with Burkina Faso. In 2004, the first Nigerien gold ingot was produced from the Samira Hill Gold Mine, in Téra Department, which thus became the site of the first commercial gold production in the country. The reserves at the location were estimated at 10,073,626 tons of ore, to be recovered over a six-year mine life. Other gold deposits are believed to lie in nearby areas known as the "Samira Horizon", located between Gotheye and Ouallam. SONICHAR (Société Nigérienne de Charbon) in Tchirozerine (north of Agadez) extracts coal from an open pit and fuels an electricity generating plant that supplies energy to the uranium mines. Based on 2012 reports by the government of Niger, 246,016 tons of coal were extracted by SONICHAR in 2011.
There are additional coal deposits to the south and west that are of a higher quality and may be exploitable. Substantial deposits of phosphates, coal, iron, limestone, and gypsum have also been found in Niger. Oil The history of oil prospecting and discovery goes back to the independence era, with the first discovery, the Tintouma oil field at Madama, in 1975. It is the Agadem basin that has attracted the most attention since 1970, with Texaco and then Esso prospecting in the basin until 1980. Exploration permits on the same basin were held successively by Elf Aquitaine (1980–1985), Esso-Elf (1985–1998), Esso (1998–2002) and Esso-Petronas (2002–2006). While the reserves were estimated at 324 million barrels of oil and 10 billion m³ of gas, Esso-Petronas relinquished the permit because it deemed the quantities too small for production. With the sudden increase in oil prices, this assessment was no longer true by 2008, and the government transferred the Agadem block rights to CNPC. Niger announced that, in exchange for the US$5 billion investment, the Chinese company would build wells, 11 of which would open by 2012, a refinery near Zinder and a pipeline out of the nation. The government believes the area holds further reserves and is seeking more oil in the Ténéré Desert and near Bilma. Niger began producing its first barrels of oil in 2011. Growth rates The economic competitiveness created by the January 1994 devaluation of the Communauté Financière Africaine (CFA) franc contributed to an annual average economic growth of 3.5% throughout the mid-1990s. But the economy stagnated due to the sharp reduction in foreign aid in 1999 (which gradually resumed in 2000) and poor rains in 2000. Reflecting the importance of the agricultural sector, the return of good rains was the primary factor underlying economic growth of 5.1% in 2000, 3.1% in 2001, 6.0% in 2002, and 3.0% in 2003.
In recent years, the Government of Niger drafted revisions to the investment code (1997 and 2000), petroleum code (1992), and mining code (1993), all with attractive terms for investors. The present government actively seeks foreign private investment and considers it key to restoring economic growth and development. With the assistance of the United Nations Development Programme (UNDP), it has undertaken a concerted effort to revitalize the private sector. A significant share of GDP is generated by livestock production (camels, goats, sheep and cattle), said to support 29% of the population. The 15% of Niger's land that is arable is found mainly along its southern border with Nigeria. Rainfall varies and, when insufficient, Niger has difficulty feeding its population and must rely on grain purchases and food aid to meet food requirements. Although the rains in 2000 were not good, those in 2001 were plentiful and well distributed. Pearl millet, sorghum, and cassava are Niger's principal rain-fed subsistence crops. Irrigated rice for internal consumption, while expensive, has, since the devaluation of the CFA franc, sold for below the price of imported rice, encouraging additional production. Cowpeas and onions are grown for commercial export, as are small quantities of garlic, peppers, potatoes, and wheat. Groundnuts and, to a lesser degree, cotton, introduced by former colonial power France in the 1930s and 1950s respectively, account for most of Nigerien industrial agriculture. Prior to the mass exploitation of uranium in the early 1970s, groundnut oil was the largest Nigerien export by value.
The majority of Niger's population are rural residents engaged in agriculture, mostly in the south centre and south west of the nation. While these people depend on the agricultural market for portions of their production and consumption, much of Nigerien farming is subsistence agriculture outside of the marketplace. External trade and investment Of Niger's exports, foreign exchange earnings from livestock, although impossible to quantify, are second only to those from uranium. Actual exports far exceed official statistics, which often fail to detect large herds of animals informally crossing into Nigeria. Some hides and skins are exported and some are transformed into handicrafts. Mining The persistent uranium price slump has brought lower revenues for Niger's uranium sector, although uranium still provides 72% of national export proceeds. The nation enjoyed substantial export earnings and rapid economic growth during the 1960s and 1970s after the opening of two large uranium mines near the northern town of Arlit. When the uranium-led boom ended in the early 1980s, however, the economy stagnated and new investment since then has been limited. Niger's two uranium mines (SOMAIR's open pit mine and COMINAK's underground mine) are owned by a French-led consortium and operated by French interests. Exploitable deposits of gold are known to exist in Niger in the region between the Niger River and the border with Burkina Faso. Substantial deposits of phosphates, coal, iron, limestone, and gypsum have also been found. Numerous foreign companies, including American firms, have taken out exploration licenses for concessions in the gold seam in western Niger, which also contains deposits of other minerals. Several oil companies have explored for petroleum since 1992 in the Djado plateau in north-eastern Niger and the Agadem basin, north of Lake Chad, but made no discoveries worth developing at the time.
In June 2007, however, China National Petroleum Corporation (state-owned by the People's Republic of China) signed a US$5 billion agreement to extract oil in the Agadem block, as well as build an oil refinery and a 2,000 km oil pipeline in the country; production is expected to start in 2009. Niger's known coal reserves, with low energy and high ash content, cannot compete against higher quality coal on the world market. However, the parastatal SONICHAR (Société nigérienne de charbon) in Tchirozerine (north of Agadez) extracts coal from an open pit and fuels an electricity generating plant that supplies energy to the uranium mines. Economic growth After the economic competitiveness created by the January 1994 CFA franc devaluation contributed to an annual average economic growth of 3.5% throughout the mid-1990s, the economy stagnated due to the sharp reduction in foreign aid in 1999, which gradually resumed in 2000, and poor rains in 2000. Reflecting the importance of the agricultural sector, the return of good rains was the primary factor underlying a projected growth of 4.5% for 2001. Foreign investment In recent years, the Government of Niger promulgated revisions to the investment code (1997 and 2000), petroleum code (1992), and mining code (1993), all with attractive terms for investors. The present government actively seeks foreign private investment and considers it key to restoring economic growth and development. With the assistance of the United Nations Development Programme (UNDP), it has undertaken a concerted effort to revitalize the private sector. Currency Niger shares a common currency, the CFA franc, and a common central bank, the Central Bank of West African States (BCEAO), with seven other members of the West African Monetary Union.
The Treasury of the Government of France supplements the BCEAO's international reserves in order to maintain a fixed rate of 100 CFA (Communauté Financière Africaine) to the French franc (and a fixed rate to the euro since January 1, 2002). Government restructuring In January 2000, Niger's newly elected government inherited serious financial and economic problems including a virtually empty treasury, past-due salaries (11 months of arrears) and scholarship payments, increased debt, reduced revenue performance, and lower public investment. In December 2000, Niger qualified for enhanced debt relief under the International Monetary Fund program for Highly Indebted Poor Countries and concluded an agreement with the Fund on a Poverty Reduction and Growth Facility (PRGF). In addition to changes in the budgetary process and public finances, the new government has pursued economic restructuring towards the IMF-promoted privatization model. This has included the privatization of water distribution and telecommunications and the removal of price protections for petroleum products, allowing prices to be set by the world market. Further privatizations of public enterprises are in the works. In its effort to comply with the IMF's Poverty Reduction and Growth Facility plan, the government is also taking actions to reduce corruption and, as the result of a participatory process encompassing civil society, has devised a Poverty Reduction Strategy Plan that focuses on improving health, primary education, rural infrastructure, and judicial restructuring. Foreign Aid The most important donors in Niger are France, the European Union, the World Bank, the IMF and other United Nations agencies (UNDP, UNICEF, FAO, WFP, and UNFPA). Other principal donors include the United States, Belgium, Germany, Switzerland, Canada, and Saudi Arabia. While USAID does not have an office in Niger, the United States is a major donor, contributing nearly $10 million each year to Niger's development. The U.S.
also is a major partner in policy coordination in such areas as food security and HIV/AIDS. The importance of external support for Niger's development is demonstrated by the fact that about 45% of the government's FY 2002 budget, including 80% of its capital budget, derives from donor resources. In 2005, the UN drew attention to the increased need for foreign aid given severe problems with drought and locusts resulting in a famine endangering the lives of around a million people. Macro-economic trend The following table shows the main economic indicators in 1980–2020 (with IMF staff estimates for 2021–2026). Inflation below 5% is in green. The annual unemployment rate is extracted from the World Bank, although the International Monetary Fund considers those figures unreliable. Statistics
GDP: purchasing power parity – $21.86 billion (2017 est.)
GDP – real growth rate: 4.9% (2017 est.)
GDP – per capita: purchasing power parity – $1,200 (2017 est.)
GDP – composition by sector: agriculture: 41.6%, industry: 19.5%, services: 38.7% (2017)
Population below poverty line: 45.4% (2014 est.)
Household income or consumption by percentage share: lowest 10%: 3%, highest 10%: 29.3% (1992)
Inflation rate (consumer prices): 2.4% (2017 est.)
Labour force: 6.5 million (2017 est.)
Labour force – by occupation: agriculture: 79.2%, industry: 3.3%, services: 17.5% (2012 est.)
Unemployment rate: 0.3% (2017 est.)
Budget: revenues: $1.757 billion, expenditures: $2.171 billion (2017 est.)
Industries: uranium mining, cement, brick, textiles, food processing, chemicals, slaughterhouses
Industrial production growth rate: 6% (2017 est.)
Electrification: total population: 15%, urban areas: 62%, rural areas: 4% (2013)
Electricity – production: 494.7 million kWh (2016 est.)
Electricity – production by source: fossil fuel: 95%, renewable: 5%, nuclear: 0%, other: 0% (2017)
Electricity – consumption: 1.065 billion kWh (2016 est.)
Electricity – exports: 0 kWh (2016 est.)
Electricity – imports: 779 million kWh (2016 est.)
Agriculture – products: cowpeas, cotton, peanuts, pearl millet, sorghum, cassava (tapioca), rice
and promoting democracy. Press freedom "improved considerably" after Mamadou Tandja was ousted as president in 2010. Media offences were decriminalised shortly afterwards. With the passage of the 2010 law protecting journalists from prosecution related to their work and President Issoufou's November 2011 endorsement of the Declaration of Table Mountain statement on press freedom in Africa (the first head of state to sign the statement), the country continues its efforts to improve press freedom. The Declaration of Table Mountain calls for the repeal of criminal defamation and "insult" laws and for moving press freedom higher on the African agenda. Telephones
Calling code: +227
International call prefix: 00
Main lines: 100,500 lines in use, 145th in the world (2012); 24,000 lines in use, 186th in the world (2005).
Mobile cellular: 5.4 million lines, 107th in the world (2012); 900,000 lines, 139th in the world (2007).
Telephone system: inadequate; small system of wire, radio telephone communications, and microwave radio relay links concentrated in the southwestern area of Niger; domestic satellite system with 3 earth stations and 1 planned; combined fixed-line and mobile-cellular teledensity remains only about 30 per 100 persons despite a rapidly increasing cellular subscribership base (2010); United Nations estimates placed telephone subscribers at 0.2 per hundred in 2000, rising to 2.5 per hundred in 2006.
Satellite earth stations: 2 Intelsat (1 Atlantic Ocean and 1 Indian Ocean) (2010).
Communications cables: Africa Coast to Europe (ACE) via land links between Niger and the Atlantic coast.
Internet
Top-level domain: .ne, controlled by the parastatal telecom company, SONITEL.
Internet users: 230,084 users, 150th in the world; 1.4% of the population, 205th in the world (2012); 115,900 users, 155th in the world (2009); 40,000 users, 173rd in the world (2006).
Fixed broadband: 3,596 subscriptions, 166th in the world; less than 0.05% of the population, 185th in the world (2012).
Wireless broadband: Unknown (2012).
Internet hosts: 454 hosts, 185th in the world (2012); 216 hosts, 176th in the world (2008).
IPv4: 20,480 addresses allocated (2012).
Radio and television
Radio stations: 5 AM, 6 FM, and 4 shortwave stations (2001).
Radios: 680,000 (1997); 500,000 (1992).
Television stations: state-run TV station; 3 private TV stations provide a mix of local and foreign programming (2007).
Television sets: 125,000 (1997); 37,000 (1992).
Because literacy levels in the country are low, radio is a key source for news and information. Radio France Internationale (RFI) is available in the capital, Niamey, and in the Maradi and Zinder regions. The BBC World Service broadcasts in the capital (100.4 FM). Press freedom and control The state controls much of the nation's broadcasting, though private radio stations have proliferated. The media regulatory body, the National Observatory on Communication, and the Independent Nigerien Media Observatory for Ethics, a voluntary media watchdog organization, help to maintain the media environment in Niger. The government maintains a 200 million CFA (~$400,000 USD) press support fund, established by law and available to all media, to encourage support for education, information, entertainment, and promoting democracy.
ports, Niger does operate a ports authority. Niger relies on the port at Cotonou (Benin), and to a lesser degree Lomé (Togo) and Port Harcourt (Nigeria), as its main route to overseas trade. Abidjan was in the process of regaining Niger's port trade, following the disruption of the Ivorian Civil War, beginning in 1999. Niger operates a Nigerien Ports Authority station, as well as customs and tax offices, in a section of Cotonou's port, so that imports and exports can be directly transported between Gaya and the port. Uranium from the French-operated mines at Arlit, Niger's largest export by value, travels through this port to France and the world market. Airports The US government estimated there were 27 airports and/or landing strips in Niger as of 2007. Nine of these had paved runways and 18 had unpaved landing strips. ICAO codes for Niger are prefixed "DR". Of the 9 airports with paved runways, 2 have strips of 2,438 to 3,047 m: Diori Hamani International Airport and Mano Dayak International Airport. These are the only two Nigerien airports with regular international commercial flights. Six of the remainder have strips between 1,524 and 2,437 m, while one is under 914 m. The 18 additional airports have unpaved runways, 15 of them with strips between 914 and 1,523 m.
Major airports (with ICAO code and IATA code) include:
DRRM (MFQ) – Maradi Airport – Maradi
DRRN (NIM) – Diori Hamani International Airport – Niamey
DRRT (THZ) – Tahoua Airport – Tahoua
DRZA (AJY) – Mano Dayak International Airport – Agadez South
DRZL (RLT) – Arlit Airport – Arlit
DRZR (ZND) – Zinder Airport – Zinder
DRZF () – Diffa Airport – Diffa
DRZD () – Dirkou Airport – Dirkou
DRRB (BKN) – Birni N'Konni Airport – Birni N'Konni
Other airstrips (with ICAO codes) include: DRRI Bilma, DRRC Dogondoutchi, DRRD Dosso, DRRG Gaya, DRZG Goure, DRZI Iferouane, DRRP La Tapoa, DRZM Maine Soroa, DRZN N'guigmi, DRRU Ouallam, DRZT Tanout, DRRA Tessaoua, DRRE Téra, DRRL Tillabery, DRRZ Tillia.
Railway Niger is a user of the Benin and Togo railway lines, which carry goods from seaports to the Niger border. Rail lines to Niamey and other points in Niger were proposed during the colonial period, and continue to be discussed. In 2012, a multi-national railway system was proposed to connect Benin, Niger, Burkina Faso and Ivory Coast. Other lines connecting Nigeria to Niger have also been discussed. For example, on 13 August 2013 in Nigeria, the Vice President of Nigeria, Namadi Sambo, announced that Nigeria is to construct a line into the Republic of Niger. The new track will be an extension of the existing branch from Zaria to Kaura-Namoda, which is to be continued via Sokoto to Birnin Kebbi. In the longer term it will extend the line across the border to Niamey, capital of Niger. The existing branch is currently out of commission, but rehabilitation has commenced. In April 2014, Niamey Railway Station was officially inaugurated and construction began on the railway extension connecting Niamey to Cotonou via Parakou (Benin). This railway line is expected to go through Dosso city and Gaya in the territory of Niger before crossing into Benin. The line from Niamey to Dosso city is expected to be completed before December 2014.
and semi-converted trucks taking passengers and goods. Services are sometimes scheduled from the "Highway stations" ("Gares routières") found in every town, but are more frequently ad hoc: vehicles ply the trade between towns, picking up at stations or anywhere along the route, and departing only when full. Animals pulling wagons and loaded camel trains remain a common sight on Nigerien roads. Motor vehicle regulation Vehicles in Niger are subject to the "Laws of the Road" ("Code de la route"), for which the government began a continuing reform in 2004–2006 and which is based substantially on French models. Vehicles travel on the right side of the road, and roads use French-style signage. Routes Nationales are marked with the traditional French milestones: a white tablet with a red top, marked with the route number. Vehicle owners must obtain a registration document ("carte grise") and vehicle license plates ("plaques d'immatriculation"), which are of similar manufacture to those in Guinea and Mali. Licence plates usually contain the national code "RN" for international travel. Niger is a signatory of the September 1949 Geneva Convention on Road Traffic, and thus honours international driving permits from other signatories. Driver's licenses are regulated through the national Ministry of Transport, but issued by local officials. Drivers must pass a driving test to qualify.
A 2009 enforcement blitz in Niamey resulted in numerous arrests of owners of small motorbikes, common in Nigerien cities. One newspaper reported that most riders believed, erroneously, that no license or regulation was required by law for motorbikes under 50 cc in engine size; these had in fact been regulated in law since 2002, though the rules had not been enforced. Motorbikes are also a common means of public transport in some Nigerien cities. These motorcycle "taxis motos", or "kabu kabu", are the primary form of taxis in cities like Zinder, Agadez, and Maradi. In Zinder, a 2009 local newspaper report claimed there were no more than "three to five" automobile taxis operating in a diffuse city which consequently relies upon the only partially regulated motorcycle taxi sector. Road safety Road accidents have been identified as a major public health concern by the Nigerien government. According to Chékarou Bagoudou, Chief of the Division of Road Safety and Security of the Nigerien Ministry of Transport, there were 4,338 officially reported road accidents in 2008, with 7,443 victims, of whom 616 were killed. With the Nigerien government counting 18,949 km of roads in the nation, this comes to about one accident for every 4.4 kilometers of road in 2008. Speaking before a National Assembly session, Bagoudou said that the 42.2 billion CFA francs spent on medical costs for road accident victims accounted for around 25% of the 2008 budget of the Nigerien Ministry of Public Health. Ministry figures concluded that 70% of road accidents were caused by "human factors", 23% by mechanical faults and 7% by road conditions. Waterways The Niger River is navigable for 300 km from Niamey to Gaya on the Benin frontier from mid-December to March. Thereafter a series of falls and rapids render the Niger unnavigable in all seasons. In the navigable stretches, shallows prevent all but small-draft African canoes (pirogues and pinasses) from operating in many areas.
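The road-safety figures above imply a per-kilometer accident rate that can be recomputed directly; a quick sketch using only the numbers quoted in the text:

```python
# Road-safety figures reported for Niger in 2008 (as quoted above).
accidents = 4338    # officially reported road accidents
victims = 7443      # people injured or killed
deaths = 616        # fatalities
road_km = 18949     # kilometres of road counted by the government

km_per_accident = road_km / accidents   # road length per reported accident
fatality_share = deaths / victims       # fraction of victims who were killed

print(round(km_per_accident, 1))        # 4.4 (km of road per accident)
print(round(fatality_share * 100, 1))   # 8.3 (% of victims killed)
```

This confirms that the reported totals work out to roughly one accident per 4.4 km of road, with about one in twelve victims killed.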
As there is only one major bridge over the Niger (The Kennedy Bridge in Niamey: the Niger River bridge at Gaya crosses into Benin), car ferries are of crucial importance, especially the crossing at Bac Farie, 40 km north of Niamey on the RN4, and the car ferry at Ayorou.
for Intervention and Security (FNIS) (Forces Nigériennes d'Intervention et de Sécurité) count a combined 3,700-member paramilitary police force. The FNIS, along with some special units of the Gendarmerie, are armed and trained in military fashion, similar to the Internal Troops of the nations of the former Soviet Union. The Gendarmerie has law enforcement jurisdiction outside the Urban Communes of Niger, while the National Police patrols towns. Special internal security operations may be carried out by the military, the FNIS, the Gendarmerie, or whichever forces are tasked by the Government of Niger. Domestic conflicts The First Tuareg Rebellion of 1985–1995 From 1985 to 1995, the armed forces of Niger were engaged in fighting with the Popular Front for the Liberation of Niger (FPLN). An armed attack by FPLN members in Tchin-Tabaradene in 1985 sparked the closing of the borders with Libya and Algeria, and the resettlement of thousands of Tuareg and other nomads away from the area. Failed promises by the government of Ali Saïbou fueled growing Tuareg discontent, leading to an attack on a police station in Tchin-Tabaradene in May 1990. The Niger Army responded violently in May 1990, arresting, torturing, and killing several hundred Tuareg civilians in Tchin-Tabaradene, Gharo and In-Gall in what is known as the Tchin-Tabaradene massacre. Tuareg outrage sparked the creation of two armed insurgent groups, the Front for the Liberation of Aïr and Azaouak and the Front for the Liberation of Tamoust, and fighting continued until a peace agreement ended it in 1995. The Nigerien Armed Forces have been extensively involved in politics since independence, and have been denounced at several points for broad abrogation of human rights and unlawful detentions and killings. The Second Tuareg Rebellion of 2007–2009 The Nigerien Armed Forces were involved from 2007 to 2009 in an insurgency in the north of the country, labeled the Second Tuareg Rebellion.
A previously unknown group, the Mouvement des Nigériens pour la Justice (MNJ), emerged in February 2007. The predominantly Tuareg group issued a number of demands, mainly related to development in the north. It attacked military and other facilities and laid landmines in the north. The resulting insecurity devastated Niger's tourist industry and deterred investment in mining and oil. The government labeled the MNJ criminals and traffickers, and refused to negotiate with the group until it disarmed. As of July 2008, some 100 to 160 Nigerien troops had been killed in the conflict. The Second Tuareg Rebellion ended in 2009 with peace talks hosted by Libya. Foreign missions In 1991, Niger sent a 400-man military contingent to join the American-led allied forces against Iraq during the Gulf War. Niger provides a battalion of peace-keeping forces to the UN mission in Ivory Coast. As of 2003, the FAN had troops deployed in the following foreign missions: ECOMOG: Liberia, Guinea-Bissau; African Union: Burundi (MIOB), Comoros (MIOC), Mali (AFISMA); United Nations: Saudi Arabia (Gulf War), Rwanda (MINURCA), Democratic Republic of Congo (MONUC), Mali (MINUSMA). Defense cooperation Niger's defense forces have a long history of military cooperation with neighboring countries in the region, as well as with France, the United States, China and many other countries. Regional defense cooperation Through ECOWAS and the African Union, Niger's defense forces have been involved in multiple missions in Africa and West Africa. Niger has supported, and volunteered to participate in, the African Union's planned rapid intervention force. In addition, with the growing threat of Boko Haram, the defense forces of Niger, Nigeria, Cameroon and Chad have intensified cooperation to address the trans-border threat of this organization. Counter-terrorism defense cooperation U.S. and French defense cooperation with Niger has intensified since 9/11 as part of the Global War on Terror.
The Niger defense forces, along with forces from Chad, Mali and Mauritania, have become major partners of France and the United States in counter-terrorism efforts in Africa. The counter-terrorism efforts have focused mainly on Al-Qaïda affiliated groups in Africa, in particular the Algerian Group for Call and Combat, which later became AQMI. The collapse of the Gaddafi regime, followed by the dispersal of his arsenal across the region, accentuated the precarious situation of many Sahelian nations. The Northern Mali conflict and the beginning of Operation Serval to free northern Mali of Islamic militant groups solidified the role of Niger in counter-terrorism activities in the region. Following an agreement with the Niger government, air force base 101 in Niamey has served as a permanent drone hub for French and U.S. forces since 2013. Drone intelligence-gathering activities in Mali and the region were carried out from this base during Serval, and Niamey has become the intelligence-gathering hub for French and U.S. forces in the region. Political involvement In 1974, General Seyni Kountché overthrew the first president of Niger, Hamani Diori. The army's components include a light armored company, a camel corps, and a number of support units. It was reorganized in 2003 to create the Niger Air Force as a distinct service branch. Training Basic training is carried out at Niamey, at the Tondibiah base, and at Agadez. Other special training centers include the National Officers Training School (French: École de Formation des Forces Armées Nigériennes, or EFOFAN) and the Paramedical Personnel Training School (EPPAN), both based at the Tondibiah base. In addition to training in Niger, army officers also train in France at the Special Military School of Saint-Cyr, in Morocco at the Royal Military Academy of Meknès, in Algeria, and in the US. With the growing cross-border threats of terrorism in West Africa, the Niger Army has benefited from training exercises with France and the U.S.
The Niger Army has participated in the U.S.-led Flintlock exercise, which it hosted in 2014. Equipment The army of Niger is poorly equipped with armored vehicles and tanks. With the exception of two armored vehicles purchased from China in 2009, most armoured vehicles are at least 20 years old. The army is, however, well stocked with 4x4 Toyota Land Cruisers mounted with machine guns of various calibers. Logistically, fuel and water tankers and ambulances have recently been improved to help with long-distance patrol missions and to increase the general logistic capacity of the army. Armor Niger Air Force History The predecessor of the Niger Air Force, the Niger National Escadrille (Escadrille Nationale du Niger), was first formed in 1961. It was later restructured into the National Air Wing (Groupement Aérien National) in 1989. Prior to 2003, the military armed forces of Niger (Forces Armées Nigériennes, or FAN) were grouped in one branch with one Chief of Staff, who oversaw both the ground forces and the National Air Wing. Following an organizational restructuring in 2003, the armed forces of Niger were structured into two main service branches: the Niger Army (French: armée de terre) for all ground military forces, and the Niger Air Force (armée de l'air). Each branch was headed by a Chief of Staff answerable to the Joint Chief of Staff of the armed forces. As part of this new structure, the National Air Wing was renamed the Niger Air Force (Armée de l'Air du Niger) on December 17, 2003. The Niger Air Force is led by the Air Force Chief of Staff, answerable to the Joint Chief and the Defense Minister. Presently, the Chief of Staff is Col. Boulama Issa. Structure Organizationally, the air force is composed of a Chief of Staff Office, operational units (French: escadrons), technical units, an infantry company (compagnie de fusiliers) and general staff. The Chief of Staff of the Niger Air Force (chef d'état-major) is Colonel Abdoul Kader Amirou.
Training At the moment, there are no dedicated air force training facilities in Niger. Basic training of Air Force recruits is conducted at the Tondibiah base along with recruits of the other military service branches. Air force officers, pilots, and mechanics are additionally trained in France, the US, and North African countries such as Morocco (at the Royal Air Force School of Marrakech) and Algeria. In addition, local training activities are undertaken with foreign partners (the U.S., France, etc.) to update skills. In 2014, a logistics company was trained and equipped by the United States with fuel and water trucks, ambulances, and 4x4 unarmed vehicles. Aircraft The aircraft inventory of the Niger Air Force is modest, though it has increased with new acquisitions beginning in 2008 and further assistance from France and the US. This expansion in capacity is guided by the need for better border patrol following the crises in Libya and Mali. Current inventory Paramilitary forces There are two paramilitary service branches: the National Gendarmerie of Niger, under the Ministry of Defense, and the National Guard of Niger, under the Ministry of Interior. Each branch is headed by a Chief of Staff answerable to the overseeing ministry. National Gendarmerie The National Gendarmerie is commanded by the Superior Commander of the National Gendarmerie. Unlike the National Police and the National Guard, the National Gendarmerie is under the control of the Ministry of Defense of Niger. It is divided between territorial brigades and mobile brigades. In addition to territorial defense and maintaining public order, it provides military and paramilitary justice to the other corps of the armed forces and participates in judicial and surveillance police activities. It is regarded as an elite force due to having the most stringent recruitment criteria of all the armed forces. Due to increasing cross-border trafficking of weapons and drugs, its activities have increased in border areas.
The National Gendarmerie, unlike the Army or the National Guard, has never been directly involved in an attempt to seize or control power by force. National Guard Formerly known as the National Forces of Intervention and Security, the National Guard of Niger is responsible for security in rural areas where the national police is absent. It is overseen by the Superior Commander of the National Guard, who reports to the Ministry of Interior. This body is responsible for border and territorial surveillance of the country, public safety, maintaining and restoring order, protecting public buildings and institutions as well as people and their property, the execution of administrative police duties in rural and pastoral areas, management and monitoring of prisons, humanitarian actions in the case of national disaster or crisis, and protection of the environment. It is also responsible for providing security to administrative authorities and to the diplomatic and consular representations of Niger abroad. National Police The General Directorate of National Police, headquartered in Niamey, was until the 1999 Constitution under the command of the Armed Forces and Ministry of Defense. Today, only the National Gendarmerie reports to the Ministry of Defense, with the National Police and its paramilitary arm, the FNIS, moved to the Nigerien Interior Ministry. The National Gendarmerie (modeled on the French Gendarmerie) and the National Forces for Intervention and Security
Organisation of Islamic Cooperation. Niger belongs to the United Nations and its main specialized agencies, and in 1980–81 served on the UN Security Council. The first president of Niger, Hamani Diori, maintained close relations with the West and became internationally prominent in his diplomatic work, seeking to broker resolutions to conflicts in Africa and beyond. He was particularly prominent as a negotiator during the Nigerian Civil War. Niger maintains a permanent mission to the United Nations Headquarters in New York City, at 417 East 50th Street. In 2009, its Ambassador to the United Nations was Ibrahim A. Abani.
Status of diplomatic bilateral relations Have diplomatic relations with Niger Unclear (whether they have established diplomatic relations with Niger or not) No relations Bilateral relations Other Niger has only 24 permanent embassies abroad, although more countries have permanent representation in Niamey, either through national embassies or other representatives. The United Kingdom, for instance, operates its permanent office for relations with Niger from Accra, Ghana, while Niger's permanent representative resides at the Nigerien Embassy in Paris. Many other small or distant nations have no formal diplomatic relations with Niamey except through their respective consulates at the United Nations Headquarters in New York City. Australia, for instance, only signed the instruments of formal diplomatic relations with Niamey on 7 May 2009, through their respective consular officials at the UN. Border disputes Libya has in the past claimed a strip along their shared border of about 19,400 km² in northern Niger. There have been several decades of unresolved discussions regarding the delimitation of international boundaries in the vicinity of Lake Chad between Niger, Nigeria, Chad, and Cameroon. The lack of firm borders, as well as the receding of the lake in the 20th century, led to border incidents between Cameroon and Chad in the past. An agreement has been completed and awaits ratification by Cameroon, Chad, Niger, and Nigeria. Niger has an ongoing conflict with Benin over Lété Island, an island in the River Niger approximately 16 kilometres long and 4 kilometres wide, located around 40 kilometres from the town of Gaya, Niger. Together with other smaller islands in the River Niger, it was the main object of a territorial dispute between Niger and Benin, which had begun when the two entities were still under French rule.
elected civilian President of Nigeria, heralding the beginning of the Fourth Nigerian Republic. This ended almost 33 years of military rule (from 1966 until 1999, excluding the short-lived Second Republic of 1979–1983) by military dictators who had seized power in coups d'état and counter-coups. Although the elections that brought Obasanjo to power and gave him a second term in the 2003 presidential election were condemned as unfree and unfair, Nigeria has shown marked improvement in attempts to tackle government corruption and hasten development. Ethnic violence over control of the oil-producing Niger Delta region and an insurgency in the northeast are some of the issues facing the country. Umaru Yar'Adua of the People's Democratic Party came into power in the general election of 2007. The international community, which had been observing Nigerian elections to encourage a free and fair process, condemned this one as being severely flawed. President Olusegun Obasanjo acknowledged fraud and other electoral "lapses" but said the result reflected opinion polls. In a national television address in 2007, he added that if Nigerians did not like the victory of his handpicked successor, they would have an opportunity to vote again in four years. Yar'Adua died on May 5, 2010, and Goodluck Jonathan was sworn in as his successor, becoming the 14th head of state. Jonathan went on to win the 2011 presidential election, with the international media reporting the elections as having run smoothly, with relatively little violence or voter fraud in contrast to previous elections. Ahead of the general election of 2015, a merger of the biggest opposition parties – the Action Congress of Nigeria, the Congress for Progressive Change, the All Nigeria Peoples Party, a faction of the All Progressives Grand Alliance, and the new PDP (a faction of serving governors of the ruling People's Democratic Party) – formed the All Progressives Congress.
In the 2015 presidential election, former military head of state General Muhammadu Buhari – who had previously contested in the 2003, 2007, and 2011 presidential elections – defeated incumbent Jonathan of the People's Democratic Party by over two million votes, ending the party's sixteen-year rule in the country and marking the first time in the history of Nigeria that an incumbent president lost to an opposition candidate. Observers generally praised the election as being fair, and Jonathan was generally praised for conceding defeat and limiting the risk of unrest. In the 2019 presidential election, Buhari was re-elected for a second term in office, defeating his closest rival, Atiku Abubakar. Politics Nigeria is a federal republic modelled after the United States, with executive power exercised by the President. The president is both head of state and head of the federal government and is elected by popular vote to a maximum of two four-year terms. The president's power is checked by a Senate and a House of Representatives, which are combined in a bicameral body called the National Assembly. The Senate is a 109-seat body with three members from each state and one from the capital region of Abuja; members are elected by popular vote to four-year terms. The House contains 360 seats, with the number of seats per state determined by population. Ethnocentrism, tribalism, religious persecution, and prebendalism have plagued Nigerian politics both before and after independence in 1960. All major parties have practised vote-rigging and other means of coercion to remain competitive. In the period before the 1983 election, a report prepared by the National Institute of Policy and Strategic Studies showed that only the 1959 and 1979 elections were held without systemic rigging. In 2012, Nigeria was estimated to have lost over $400 billion to corruption since independence.
Kin-selective altruism is prevalent in Nigerian politics, resulting in tribalist efforts to concentrate federal power in a particular region of interest. Because of the above issues, Nigeria's political parties are pan-national and secular in character (though this does not preclude the continuing preeminence of the dominant ethnicities). The two major political parties are the People's Democratic Party of Nigeria and the All Progressives Congress, with twenty registered minor opposition parties. The Hausa-Fulani, Yoruba, and Igbo are the three largest ethnic groups in Nigeria and have maintained historical preeminence in Nigerian politics; competition amongst these three groups has fuelled animosity. Following the bloody civil war, nationalism has increased in the southern part of the country, leading to active secessionist movements such as the Oodua Peoples Congress and the Movement for the Actualization of the Sovereign State of Biafra, though these groups are generally small and not representative of their entire ethnic groups. Law The country has a judicial branch, with the highest court being the Supreme Court of Nigeria. There are three distinct systems of law in Nigeria: common law, derived from its British colonial past and developed further after independence; customary law, derived from indigenous traditional norms and practices, including the dispute resolution meetings of pre-colonial Yorubaland secret societies such as the Oyo Mesi and Ogboni, as well as the Ekpe and Okonko of Igboland and Ibibioland; and Sharia law, an Islamic legal system in use long before the colonial administration, applied only in the predominantly Muslim northern states of the country. The laws of Nigeria are codified, meaning that Nigeria has a written constitution. The current constitution of Nigeria is the 1999 constitution, as amended.
Military The Nigerian military is charged with protecting the Federal Republic of Nigeria, promoting Nigeria's global security interests, and supporting peacekeeping efforts, especially in West Africa, in support of the doctrine sometimes called Pax Nigeriana. The Nigerian military consists of an army, a navy, and an air force. The military has played a major role in the country's history since independence: various juntas have seized control of the country and ruled it through most of its history. Its last period of military rule ended in 1999, following the sudden death of Sani Abacha in 1998; his successor, Abdulsalami Abubakar, handed over power to the democratically elected government of Olusegun Obasanjo the next year. As Africa's most populous country, Nigeria has repositioned its military as a peacekeeping force on the continent. Since 1995, the Nigerian military, through ECOMOG mandates, has been deployed as peacekeepers in Liberia (1997), Ivory Coast (1997–1999), and Sierra Leone (1997–1999). Under an African Union mandate, it has stationed forces in Sudan's Darfur region to try to establish peace. The Nigerian military has been deployed across West Africa, curbing terrorism in countries like Mali, Senegal, Chad, and Cameroon, dealing with the Mali War, and helping to remove Yahya Jammeh from power in 2017. Foreign relations Upon gaining independence in 1960, Nigeria made African unity the centrepiece of its foreign policy and played a leading role in the fight against the apartheid government in South Africa. One exception to the African focus was Nigeria's close relationship with Israel throughout the 1960s; Israel sponsored and oversaw the construction of Nigeria's parliament buildings. Nigeria's foreign policy was put to the test in the 1970s after the country emerged united from its civil war, and it supported movements against white minority governments in the Southern Africa sub-region.
Nigeria backed the African National Congress by taking a committed, tough line against the South African government and its military actions in southern Africa. Nigeria was a founding member of the Organisation of African Unity (now the African Union) and has tremendous influence in West Africa and in Africa as a whole. Nigeria founded regional cooperative efforts in West Africa, functioning as the standard-bearer for the Economic Community of West African States (ECOWAS) and ECOMOG, which are economic and military organizations, respectively. With this Africa-centred stance, Nigeria readily sent troops to the Congo at the behest of the United Nations shortly after independence (and has maintained UN membership since that time). Nigeria also supported several Pan-African and pro-self-government causes in the 1970s, including garnering support for Angola's MPLA and SWAPO in Namibia, and aiding opposition to the minority governments of Portuguese Mozambique and Rhodesia. Nigeria retains membership in the Non-Aligned Movement, and in late November 2006 it organised an Africa–South America Summit in Abuja to promote what some attendees termed "South–South" linkages on a variety of fronts. Nigeria is also a member of the International Criminal Court and the Commonwealth of Nations; it was temporarily expelled from the latter in 1995 while ruled by the Abacha regime. Nigeria has remained a key player in the international oil industry since the 1970s and maintains membership in OPEC, which it joined in July 1971. Its status as a major petroleum producer figures prominently in its sometimes volatile international relations with developed countries, notably the United States, and with developing countries. Since 2000, Chinese–Nigerian trade relations have risen exponentially, with an increase in total trade of over 10,384 million dollars between the two nations from 2000 to 2016.
However, the structure of the Chinese–Nigerian trade relationship has become a major political issue for the Nigerian state. Chinese exports account for around 80 per cent of total bilateral trade volumes, resulting in a serious trade imbalance, with Nigeria importing ten times more than it exports to China. Consequently, Nigeria's economy is becoming over-reliant on cheap imports to sustain itself, resulting in a clear decline in Nigerian industry under such arrangements. Continuing its Africa-centred foreign policy, Nigeria introduced the idea of a single currency for West Africa, known as the Eco, under the presumption that it would be led by the naira. But on December 21, 2019, Ivorian President Alassane Ouattara, Emmanuel Macron, and multiple other UEMOA states announced that they would merely rename the CFA franc instead of replacing the currency as originally intended. As of 2020, the Eco currency has been delayed to 2025. Administrative divisions Nigeria is divided into thirty-six states and one Federal Capital Territory, which are further sub-divided into 774 local government areas. In some contexts, the states are aggregated into six geopolitical zones: North West, North East, North Central, South West, South East, and South South. Nigeria has five cities with a population of over a million (from largest to smallest): Lagos, Kano, Ibadan, Benin City, and Port Harcourt. Lagos is the largest city in Africa, with a population of over 12 million in its urban area. Geography Nigeria is located in western Africa on the Gulf of Guinea and is the world's 32nd-largest country by total area. It shares land borders with Benin, Niger, Chad, and Cameroon (including the separatist Ambazonia), and has a coastline on the Gulf of Guinea. Nigeria lies between latitudes 4° and 14°N, and longitudes 2° and 15°E. The highest point in Nigeria is Chappal Waddi.
The main rivers are the Niger and the Benue, which converge and empty into the Niger Delta, one of the world's largest river deltas and the location of a large area of Central African mangroves. Nigeria's most expansive topographical region is that of the valleys of the Niger and Benue rivers, which merge and form a Y-shape. To the southwest of the Niger is a "rugged" highland. To the southeast of the Benue are hills and mountains, which form the Mambilla Plateau, the highest plateau in Nigeria. This plateau extends through the border with Cameroon, where the montane land is part of the Bamenda Highlands of Cameroon. Climate Nigeria has a varied landscape. The far south is defined by its tropical rainforest climate, with heavy annual rainfall. In the southeast stands the Obudu Plateau. Coastal plains are found in both the southwest and the southeast, and mangrove swamps are found along the coast. The area near the border with Cameroon close to the coast is rich rainforest and part of the Cross-Sanaga-Bioko coastal forests ecoregion, an important centre for biodiversity. It is a habitat for the drill primate, which is found in the wild only in this area and across the border in Cameroon. The areas surrounding Calabar, Cross River State, also in this forest, are believed to contain the world's largest diversity of butterflies. The area of southern Nigeria between the Niger and the Cross Rivers has lost most of its forest because of development and harvesting by an increased population, with the forest being replaced by grassland. Everything in between the far south and the far north is savannah, with insignificant tree cover and grasses and flowers located between trees; rainfall there is more limited. The savannah zone's three categories are Guinean forest-savanna mosaic, Sudan savannah, and Sahel savannah. Guinean forest-savanna mosaic is plains of tall grass interrupted by trees.
Sudan savannah is similar but with shorter grasses and shorter trees. Sahel savannah consists of patches of grass and sand, found in the northeast; in the Sahel region rainfall is scant and the Sahara Desert is encroaching. In the dry northeast corner of the country lies Lake Chad, which Nigeria shares with Niger, Chad, and Cameroon. Plant ecology Nigeria has numerous tree species, the majority of them native, with a few exotic species. A high percentage of man-made forests in the country is dominated by exotic species, a result of the assumption that exotic trees are fast-growing; however, studies have also compared the growth of indigenous trees with that of exotic species. Due to overexploitation, the remaining natural ecosystems and primary forests in Nigeria are restricted to protected areas, which include one biosphere reserve, seven national parks, one World Heritage site, 12 Strict Nature Reserves (SNRs), 32 game reserves/wildlife sanctuaries, and hundreds of forest reserves. These are in addition to several ex-situ conservation sites such as arboreta, botanical gardens, zoological gardens, and gene banks managed by several tertiary and research institutions. Many countries in Africa are affected by Invasive Alien Species (IAS). In 2004, the IUCN–World Conservation Union identified 81 IAS in South Africa, 49 in Mauritius, 37 in Algeria and Madagascar, 35 in Kenya, 28 in Egypt, 26 in Ghana and Zimbabwe, and 22 in Ethiopia. However, very little is known about IAS in Nigeria, with most technical reports and literature reporting fewer than 10 invasive plants in the country. Aside from plant invaders, Rattus rattus and the avian influenza virus were also considered IAS in Nigeria. The initial entry of IAS into Nigeria was mainly through exotic plant introductions by the colonial rulers, either for forest tree plantations or for ornamental purposes.
The entry of exotic plants into Nigeria during the post-independence era was encouraged by increasing economic activity, the commencement of commercial oil exploration, introductions through ships, and the introduction of ornamental plants by commercial floriculturists. In the semi-arid and dry sub-humid savannas of West Africa, including Nigeria, numerous species of herbaceous dicots, especially from the genera Crotalaria, Alysicarpus, Cassia, and Ipomoea, are widely used in livestock production. Quite often they are plucked or cut and fed either fresh or as conserved fodder. The utilization of these and many other herbs growing naturally within the farm environment is opportunistic. Many other species native to Nigeria, including soybean and its varieties, serve as an important source of oil and protein in this region. There are also many plants with medicinal uses that aid the treatment of disorders of various organs; these include members of the Euphorbiaceae, used against malaria, gastrointestinal disorders, and many other infections. Stress factors such as droughts, low soil nutrients, and susceptibility to pests have contributed to maize plantations being an integral part of agriculture in this region. As industrialization has increased, it has put forest tree species at risk of air pollution, though studies have shown that in certain parts of Nigeria trees have shown tolerance and grow in areas with a significant amount of air pollution. Environmental issues Nigeria's Delta region, home of the large oil industry, experiences serious oil spills and other environmental problems, which have caused conflict in the region. Waste management (including sewage treatment), the linked processes of deforestation and soil degradation, and climate change or global warming are the major environmental problems in Nigeria.
Waste management presents problems in a megacity like Lagos and other major Nigerian cities, linked with economic development, population growth, and the inability of municipal councils to manage the resulting rise in industrial and domestic waste. The problem is also attributable to unsustainable environmental management lifestyles, as in the Kubwa community of the Federal Capital Territory, where waste is disposed of indiscriminately and dumped along or into canals and the sewerage channels that carry water flows. Haphazard industrial planning, increased urbanisation, poverty, and the lack of competence of municipal government are seen as the major reasons for high levels of waste pollution in major cities of the country. Some of the attempted solutions have been disastrous to the environment, resulting in untreated waste being dumped in places where it can pollute waterways and groundwater. In 2005, Nigeria had the highest rate of deforestation in the world, according to the Food and Agriculture Organization of the United Nations. That year, 12.2% of the country, the equivalent of 11,089,000 hectares, was forested. Between 1990 and 2000, Nigeria lost an average of 409,700 hectares of forest every year, equal to an average annual deforestation rate of 2.4%; between 1990 and 2005 it lost in total 35.7% of its forest cover, or around 6,145,000 hectares. Nigeria had a 2019 Forest Landscape Integrity Index mean score of 6.2/10, ranking it 82nd globally out of 172 countries. In 2010, thousands of people were inadvertently exposed to lead-containing soil from informal gold mining in the northern state of Zamfara. While estimates vary, it is thought that upwards of 400 children died of acute lead poisoning, making this perhaps the largest lead poisoning fatality outbreak ever encountered.
Economy Nigeria's mixed economy is the largest in Africa, the 26th-largest in the world by nominal GDP, and 25th-largest by PPP. It is a lower-middle-income economy with an abundant supply of natural resources, well-developed financial, legal, communications, and transport sectors, and the Nigerian Stock Exchange. Economic development has been hindered by years of military rule, corruption, and mismanagement; the restoration of democracy and subsequent economic reforms have put Nigeria back on track towards achieving its full economic potential. Next to petroleum, the second-largest source of foreign exchange earnings for Nigeria is remittances sent home by Nigerians living abroad. During the oil boom of the 1970s, Nigeria accumulated a significant foreign debt to finance major infrastructural investments. With the fall of oil prices during the 1980s oil glut, Nigeria struggled to keep up with its loan payments and eventually defaulted on its principal debt repayments, limiting repayment to the interest portion of the loans. Arrears and penalty interest accumulated on the unpaid principal, which increased the size of the debt. After negotiations by the Nigerian authorities, in October 2005 Nigeria and its Paris Club creditors reached an agreement under which Nigeria repurchased its debt at a discount of approximately 60%. Nigeria used part of its oil profits to pay the residual 40%, freeing up at least $1.15 billion annually for poverty reduction programmes. Nigeria made history in April 2006 by becoming the first African country to completely pay off its debt (estimated at $30 billion) owed to the Paris Club. Agriculture About 30% of Nigerians are employed in agriculture, which used to be the principal foreign exchange earner of Nigeria. Major crops include beans, sesame, cashew nuts, cassava, cocoa beans, groundnuts, gum arabic, kolanut, maize (corn), melon, millet, palm kernels, palm oil, plantains, rice, rubber, sorghum, soybeans, and yams.
Cocoa is the leading non-oil foreign exchange earner, and rubber is the second-largest. Before the Nigerian Civil War, Nigeria was self-sufficient in food, but agriculture has failed to keep pace with Nigeria's rapid population growth, and Nigeria now relies upon food imports to sustain itself. The Nigerian government promoted the use of inorganic fertilizers in the 1970s. In August 2019, Nigeria closed its border with Benin and other neighbouring countries to stop rice smuggling into the country as part of efforts to boost local production. Petroleum and mining Nigeria is the 12th-largest producer of petroleum in the world, the 8th-largest exporter, and has the 10th-largest proven reserves. Petroleum plays a large role in the Nigerian economy, accounting for 40% of GDP and 80% of government earnings. However, agitation for better resource control in the Niger Delta, its main oil-producing region, has led to disruptions in oil production and prevents the country from exporting at 100% capacity. The Nembe Creek oil field in the Niger Delta was discovered in 1973 and produces from middle Miocene deltaic sandstone-shale in an anticline structural trap. In June 2013, Shell announced a strategic review of its operations in Nigeria, hinting that assets could be divested. While many international oil companies have operated there for decades, by 2014 most were making moves to divest their interests, citing a range of issues including oil theft. In August 2014, Shell said it was finalising the sale of its interests in four Nigerian oil fields. Nigeria has a total of 159 oil fields and 1,481 wells in operation, according to the Department of Petroleum Resources. The most productive region of the nation is the coastal Niger Delta Basin, in the Niger Delta or "south-south" region, which encompasses 78 of the 159 oil fields. Most of Nigeria's oil fields are small and scattered; as of 1990, these small fields accounted for 62.1% of all Nigerian production.
This contrasts with the sixteen largest fields, which produced 37.9% of Nigeria's petroleum at that time. In addition to its petroleum resources, Nigeria has a wide array of underexploited mineral resources, including natural gas, coal, bauxite, tantalite, gold, tin, iron ore, limestone, niobium, lead, and zinc. Despite huge deposits of these natural resources, the mining industry in Nigeria is still in its infancy. Services and tourism Nigeria has a highly developed financial services sector, with a mix of local and international banks, asset management companies, brokerage houses, insurance companies and brokers, private equity funds, and investment banks. Nigeria has one of the fastest-growing telecommunications markets in the world, with major emerging-market operators (like MTN, 9mobile, Airtel, and Globacom) basing their largest and most profitable centres in the country. Nigeria's ICT sector has experienced strong growth, representing 10% of the nation's GDP in 2018, compared to just 1% in 2001. Lagos is regarded as one of the largest technology hubs in Africa, with a thriving tech ecosystem; several startups, like Paystack, Interswitch, Bolt, and Piggyvest, are leveraging technology to solve problems across different sectors. Tourism in Nigeria centres largely on events, because of the country's many ethnic groups, but also includes rain forests, savannah, waterfalls, and other natural attractions. Abuja is home to several parks and green areas; the largest, Millennium Park, was designed by architect Manfredi Nicoletti and officially opened in December 2003. After the re-modernization project carried out by the administration of Governor Babatunde Raji Fashola, Lagos is gradually becoming a major tourist destination and is currently taking steps to become a global city. The 2009 Eyo carnival (a yearly festival originating from Iperu Remo, Ogun State) was a step toward world-city status.
Currently, Lagos is primarily known as a business-oriented and fast-paced community. Lagos has become an important location for African and black cultural identity. Many festivals are held in Lagos; festivals vary in offerings each year and may be held in different months. Some of the festivals are the Festac Food Fair (held annually in Festac Town), the Eyo Festival, Lagos Black Heritage Carnival, Lagos Carnival, Eko International Film Festival, Lagos Seafood Festac Festival, LAGOS PHOTO Festival and the Lagos Jazz Series, a unique franchise for high-quality live music in all genres with a focus on jazz. Established in 2010, the event takes place over a three-to-five-day period at selected high-quality outdoor venues. The music is as varied as the audience itself and features a diverse mix of musical genres from rhythm and blues to soul, Afrobeat, hip hop, bebop, and traditional jazz. The festivals provide entertainment of dance and song for travellers during a stay in Lagos. Lagos has sandy beaches by the Atlantic Ocean, including Elegushi Beach and Alpha Beach. Lagos also has many private beach resorts, including Inagbe Grand Beach Resort and several others on the outskirts. Lagos has a variety of hotels ranging from three-star to five-star, with a mixture of local hotels such as Eko Hotels and Suites and the Federal Palace Hotel, and franchises of multinational chains such as Intercontinental Hotel, Sheraton, and Four Points by Hilton. Other places of interest include Tafawa Balewa Square, Festac Town, the Nike Art Gallery, Freedom Park, and the Cathedral Church of Christ. Manufacturing and technology Nigeria has a manufacturing industry that includes leather and textiles (centred in Kano, Abeokuta, Onitsha, and Lagos). Nigeria currently has an indigenous auto manufacturing company, Innoson Vehicle Manufacturing, located in Nnewi. It produces buses and SUVs. 
Other manufactures include car assembly (for the French car manufacturer Peugeot as well as for the English truck manufacturer Bedford, now a subsidiary of General Motors), T-shirts, plastics and processed food. In this regard, some foreign vehicle manufacturers such as Nissan have announced plans to set up manufacturing plants in Nigeria. Ogun is considered to be Nigeria's current industrial hub, as most factories are located in Ogun and | were aged 65 years or older. The median age in 2017 was 18.4 years. Nigeria is the seventh most populous country in the world. The birth rate is 35.2 births/1,000 population and the death rate is 9.6 deaths/1,000 population as of 2017, while the total fertility rate is 5.07 children born/woman. Nigeria's population increased by 57 million from 1990 to 2008, a 60% growth rate in less than two decades. Nigeria is the most populous country in Africa and accounts for about 17% of the continent's total population as of 2017; however, exactly how populous it is remains a subject of speculation. National census results in the past few decades have been disputed. The results of the most recent census were released in December 2006 and gave a population of 140,003,542. The only breakdown available was by gender: males numbered 71,709,859; females numbered 68,293,008. According to the United Nations, Nigeria has been undergoing explosive population growth and has one of the highest growth and fertility rates in the world. By their projections, Nigeria is one of eight countries expected to account collectively for half of the world's total population increase in 2005–2050. The UN estimates that by 2100 the Nigerian population will be between 505 million and 1.03 billion people (middle estimate: 730 million). In 1950, Nigeria had only 33 million people. In 2012, President Goodluck Jonathan said Nigerians should limit their number of children. 
Millions of Nigerians have emigrated during times of economic hardship, primarily to Europe, North America and Australia. It is estimated that over a million Nigerians have emigrated to the United States and constitute the Nigerian American populace. Individuals in many such Diasporic communities have joined the "Egbe Omo Yoruba" society, a national association of Yoruba descendants in North America. Nigeria's largest city is Lagos. Lagos has grown from about 300,000 in 1950 to an estimated 13.4 million in 2017. Ethnic groups Nigeria has more than 250 ethnic groups, with varying languages and customs, creating a country of rich ethnic diversity. The three largest ethnic groups are the Hausa, Yoruba and Igbo, together accounting for more than 70% of the population, while the Edo, Ijaw, Fulɓe, Kanuri, Urhobo-Isoko, Ibibio, Ebira, Nupe, Gbagyi, Jukun, Igala, Idoma and Tiv comprise between 25 and 30%; other minorities make up the remaining 5%. The Middle Belt of Nigeria is known for its diversity of ethnic groups, including the Atyap, Berom, Goemai, Igala, Kofyar, Pyem, and Tiv. The official population count of each of Nigeria's ethnicities is disputed as members of different ethnic groups believe the census is rigged to give a particular group (usually believed to be northern groups) numerical superiority. There are small minorities of British, American, Indian, Chinese (est. 50,000), white Zimbabwean, Japanese, Greek, Syrian and Lebanese immigrants. Immigrants also include those from other West African or East African nations. These minorities mostly reside in major cities such as Lagos and Abuja, or the Niger Delta as employees for the major oil companies. Several Cubans settled in Nigeria as political refugees following the Cuban Revolution. In the middle of the 19th century, several ex-slaves of Afro-Cuban and Afro-Brazilian descent and emigrants from Sierra Leone established communities in Lagos and other regions of Nigeria. 
Many ex-slaves came to Nigeria following the emancipation of slaves in the Americas. Many of the immigrants, sometimes called Saro (immigrants from Sierra Leone) and Amaro (ex-slaves from Brazil), later became prominent merchants and missionaries in these cities. Languages There are 521 languages spoken in Nigeria; nine of them are extinct. In some areas of Nigeria, ethnic groups speak more than one language. The official language of Nigeria, English, was chosen to facilitate the cultural and linguistic unity of the country, owing to the influence of British colonisation, which ended in 1960. Many French speakers from surrounding countries have influenced the English spoken in the border regions of Nigeria, and some Nigerian citizens have become fluent enough in French to work in the surrounding countries. The French spoken in Nigeria may be mixed with some native languages, and it may also be mixed with English. The major languages spoken in Nigeria represent three major families of the languages of Africa: the majority are Niger-Congo languages, such as Igbo, Yoruba, Ijaw, Fulfulde, Ogoni, and Edo. Kanuri, spoken in the northeast, primarily in Borno and Yobe State, is part of the Nilo-Saharan family, and Hausa is an Afroasiatic language. Even though most ethnic groups prefer to communicate in their own languages, English as the official language is widely used for education, business transactions and official purposes. English as a first language is used by only a small minority of the country's urban elite, and it is not spoken at all in some rural areas. Hausa is the most widely spoken of the three main languages. With the majority of Nigeria's populace in the rural areas, the major languages of communication in the country remain indigenous languages. Some of the largest of these, notably Yoruba and Igbo, have derived standardised languages from several different dialects and are widely spoken by those ethnic groups. 
Nigerian Pidgin English, often known simply as "Pidgin" or "Broken" (Broken English), is also a popular lingua franca, though with varying regional influences on dialect and slang; it is most widely spoken in the Niger Delta region. Religion Nigeria is a religiously diverse society, with Islam and Christianity being the most widely professed religions. Nigerians are nearly equally divided into Muslims and Christians, with a tiny minority of adherents of traditional African religions and other religions. The Christian share of Nigeria's population is declining because of the lower fertility rate compared with Muslims in the north. As in other parts of Africa where Islam and Christianity are dominant, religious syncretism with the traditional African religions is common. A 2012 report on religion and public life by the Pew Research Center stated that in 2010, 49.3 per cent of Nigeria's population was Christian, 48.8 per cent was Muslim, and 1.9 per cent were followers of indigenous and other religions or unaffiliated. However, in a report released by the Pew Research Center in 2015, the Muslim population was estimated at 50%, and by 2060, according to the report, Muslims will account for about 60% of the country. The 2010 census of the Association of Religion Data Archives also reported that 48.8% of the total population was Christian, slightly larger than the Muslim population of 43.4%, while 7.5% were members of other religions. However, these estimates should be taken with caution because sample data is mostly collected from major urban areas in the south, which are predominantly Christian. Islam dominates north-western Nigeria (Hausa, Fulani and others), which is 99% Muslim, and a good portion of north-eastern Nigeria (Kanuri, Fulani and other groups). In the west, the Yoruba are predominantly split between Muslims and Christians, with 10% adherents of traditional religions. 
Protestant and locally cultivated Christianity are widely practised in western areas, while Roman Catholicism is a more prominent Christian feature of south-eastern Nigeria. Both Roman Catholicism and Protestantism are observed in the Ibibio, Anaang, Efik, Ijo and Ogoni lands of the south. The Igbo (predominant in the east) and the Ijaw (south) are 98% Christian, with 2% practising traditional religions. The middle belt of Nigeria contains the largest number of minority ethnic groups in Nigeria, who were found to be mostly Christians and members of traditional religions, with a small proportion of Muslims. Nigeria has the largest Muslim population in sub-Saharan Africa. The vast majority of Muslims in Nigeria are Sunni belonging to the Maliki school of jurisprudence; however, a sizeable minority belongs to the Shafi'i madhhab. A large number of Sunni Muslims are members of Sufi brotherhoods. Most Sufis follow the Qadiriyya, Tijaniyyah or the Mouride movements. A significant Shia minority exists. Some northern states have incorporated Sharia law into their previously secular legal systems, which has brought about some controversy. Kano State has sought to incorporate Sharia law into its constitution. The majority of Quranists follow the Kalo Kato or Quraniyyun movement. There are also Ahmadiyya and Mahdiyya minorities, as well as followers of the Baháʼí Faith. Among Christians, the Pew Research survey found that 74% were Protestant, 25% were Catholic, and 1% belonged to other Christian denominations, including a small Orthodox Christian community. Leading Protestant churches in the country include the Church of Nigeria of the Anglican Communion, the Assemblies of God Church, the Nigerian Baptist Convention and The Synagogue, Church Of All Nations. Since the 1990s, there has been significant growth in many other churches, independently started in Africa by Africans, particularly the evangelical Protestant ones. 
These include the Redeemed Christian Church of God, Winners' Chapel, Christ Apostolic Church (the first Aladura Movement in Nigeria), Living Faith Church Worldwide, Deeper Christian Life Ministry, Evangelical Church of West Africa, Mountain of Fire and Miracles, Christ Embassy, Lord's Chosen Charismatic Revival Movement, Celestial Church of Christ, and Dominion City. In addition, The Church of Jesus Christ of Latter-day Saints, the Aladura Church, the Seventh-day Adventist and various indigenous churches have also experienced growth. The Yoruba area contains a large Anglican population, while Igboland is a mix of Roman Catholics, Protestants, and a small population of Igbo Jews. The Edo area is composed predominantly of members of the Assemblies of God, which was introduced into Nigeria by Augustus Ehurie Wogu and his associates at Old Umuahia. Nigeria has become an African hub for the Grail Movement and the Hare Krishnas, and the largest temple of the Eckankar religion is in Port Harcourt, Rivers State, with a total capacity of 10,000. Health Health care delivery in Nigeria is a concurrent responsibility of the three tiers of government in the country, and the private sector. Nigeria has been reorganising its health system since the Bamako Initiative of 1987, which formally promoted community-based methods of increasing accessibility of drugs and health care services to the population, in part by implementing user fees. The new strategy dramatically increased accessibility through community-based health care reform, resulting in more efficient and equitable provision of services. A comprehensive approach strategy was extended to all areas of health care, with subsequent improvement in the health care indicators and improvement in health care efficiency and cost. HIV/AIDS rate in Nigeria is much lower compared to the other African nations such as Botswana or South Africa whose prevalence (percentage) rates are in the double digits. 
The HIV prevalence rate among adults aged 15–49 was 1.5 per cent. The life expectancy in Nigeria is 54.7 years on average, and 71% and 39% of the population have access to improved water sources and improved sanitation, respectively. The infant mortality rate is 74.2 deaths per 1,000 live births. In 2012, a new bone marrow donor programme was launched by the University of Nigeria to help people with leukaemia, lymphoma, or sickle cell disease find a compatible donor for a life-saving bone marrow transplant. Nigeria became the second African country to have successfully carried out such a transplant. In the 2014 Ebola outbreak, Nigeria was the first country to effectively contain and eliminate the Ebola threat that was ravaging three other countries in the West African region; the unique method of contact tracing employed by Nigeria became an effective method later used by countries such as the United States when Ebola threats were discovered. The Nigerian health care system is continuously faced with a shortage of doctors, known as "brain drain", because of emigration by skilled Nigerian doctors to North America and Europe. In 1995, an estimated 21,000 Nigerian doctors were practising in the United States alone, which is about the same as the number of doctors working in the Nigerian public service. Retaining these expensively trained professionals has been identified as one of the goals of the government. Education Education in Nigeria is overseen by the Ministry of Education. Local authorities take responsibility for implementing policy for state-controlled public education and state schools at a regional level. The education system is divided into kindergarten, primary education, secondary education and tertiary education. After the 1970s oil boom, tertiary education was improved so it would reach every subregion of Nigeria. 
68% of the Nigerian population is literate, and the rate for men (75.7%) is higher than that for women (60.6%). Nigeria provides free, government-supported education, but attendance is not compulsory at any level, and certain groups, such as nomads and the handicapped, are under-served. The education system consists of six years of primary school, three years of junior secondary school, three years of senior secondary school, and four, five or six years of university education leading to a bachelor's degree. The government has majority control of university education. Tertiary education in Nigeria consists of universities (public and private), polytechnics, monotechnics, and colleges of education. The country has a total of 138 universities, with 40 federally owned, 39 state-owned, and 59 privately owned. Nigeria was ranked 117th in the Global Innovation Index in 2020, down from 114th in 2019. Crime Nigeria is home to a substantial network of organised crime, active especially in drug trafficking: shipping heroin from Asian countries to Europe and America, and cocaine from South America to Europe and South Africa. Various Nigerian confraternities or student "campus cults" are active in both organised crime and political violence, as well as providing a network of corruption within Nigeria. As confraternities have extensive connections with political and military figures, they offer excellent alumni networking opportunities. The Supreme Vikings Confraternity, for example, boasts that twelve members of the Rivers State House of Assembly are cult members. At lower levels of society, there are the "area boys", organised gangs mostly active in Lagos who specialise in mugging and small-scale drug dealing. Gang violence in Lagos resulted in 273 civilians and 84 policemen being killed from August 2000 to May 2001. There is some piracy in the Gulf of Guinea, with attacks directed at all types of vessels. 
Consistent with the rise of Nigeria as an increasingly dangerous hot spot, 28 of the 30 seafarers kidnapped globally between January and June 2013 were in Nigeria. Internationally, Nigeria is infamous for a form of bank fraud dubbed 419, a type of advance-fee scam (named after Section 419 of the Nigerian Penal Code), along with the "Nigerian scam", a form of confidence trick practised by individuals and criminal syndicates. These scams involve a complicit Nigerian bank (the laws being set up loosely to allow it) and a scammer who claims to have money he needs to obtain from that bank. The victim is talked into exchanging bank account information on the premise that the money will be transferred to them and they will get to keep a cut. In reality, money is taken out instead, or large fees (which seem small in comparison with the imaginary wealth to be gained) are deducted. In 2003, the Nigerian Economic and Financial Crimes Commission was created to combat this and other forms of organised financial crime; in some cases it has succeeded in bringing crime bosses to justice and even in returning stolen money to victims. Nigeria has been pervaded by political corruption. Nigeria was ranked 136 out of 182 countries in Transparency International's 2014 Corruption Perceptions Index. More than $400 billion was stolen from the treasury by Nigeria's leaders between 1960 and 1999. In 2015, incumbent President Muhammadu Buhari said corrupt officials had stolen $150 billion from Nigeria in the preceding 10 years. Poverty Nigeria's poverty rate declined significantly in the 2010s because of economic growth. The World Bank reported that Nigeria recorded 7.4% economic growth in July 2019, its highest since the gross domestic product growth rate fell to 2%. As of May 2020, 40% of Nigerians live in poverty; this number nonetheless shows the progress of the developing country, down from the 61% of the population counted as living in poverty in 2012. 
Nigeria has presented a plan to the World Bank Group to lower this number substantially. Government instability, which affects the rate at which citizens are employed, is the major reason for poverty levels being higher in certain periods. Human rights Nigeria's human rights record remains poor. According to the U.S. Department of State, the most significant human rights problems are: the use of excessive force by security forces; impunity for abuses by security forces; arbitrary arrests; prolonged pretrial detention; judicial corruption and executive influence on the judiciary; rape, torture and other cruel, inhuman or degrading treatment of prisoners, detainees and suspects; harsh and life-threatening prison and detention centre conditions; human trafficking for prostitution and forced labour; societal violence and vigilante killings; child labour, child abuse and child sexual exploitation; domestic violence; and discrimination based on ethnicity, region and religion. Nigeria is a state party to the Convention on the Elimination of All Forms of Discrimination Against Women. It has also signed the Maputo Protocol, an international treaty on women's rights, and the African Union Women's Rights Framework. Discrimination based on sex is a significant human rights issue. Forced marriages are common. Child marriage remains common in Northern Nigeria; 39% of girls are married before age 15, although the Marriage Rights Act banning marriage of girls below 18 years old was introduced on a federal level in 2008. Polygamy is widespread in Northern Nigeria. Submission of the wife to her husband and domestic violence are common. Women have fewer land rights. Maternal mortality was 814 per 100,000 live births in 2015. Female genital mutilation is common, although a ban was implemented in 2015. In Nigeria, at least half a million women suffer from vaginal fistula, largely as a result of a lack of medical care. 
Early marriage can contribute to the development of fistula. Women face substantial political inequality in Nigeria, subjected to a sexist bias reinforced by socio-cultural, economic and oppressive practices. Women throughout the country were only politically emancipated in 1979, yet husbands continue to dictate the votes of many women in Nigeria, which upholds the patriarchal system. Most workers in the informal sector are women. Women's representation in government since independence from Britain has been very poor. Women have been reduced to sideline roles in appointive posts throughout all levels of government and still make up a tiny minority of elected officials. But with more education now available to the public, Nigerian women are taking steps toward more active public roles, and with the help of various initiatives, more businesses are being started by women. Under the Shari'a penal code that applies to Muslims in twelve northern states, offences such as alcohol consumption, homosexuality, infidelity and theft carry harsh sentences, including amputation, lashing, stoning and long prison terms. According to a 2013 survey by the Pew Research Center, 98% of Nigerians believe homosexuality should not be accepted by society. Culture Cuisine Nigerian cuisine, like West African cuisine in general, is known for its richness and variety. Many different spices, herbs, and flavourings are used in conjunction with palm oil or groundnut oil to create deeply flavoured sauces and soups, often made very hot with chilli peppers. Nigerian feasts are colourful and lavish, while aromatic market and roadside snacks cooked on barbecues or fried in oil are plentiful and varied. Fashion Nigeria is known not only for the fashion textiles and garment pieces distinctive to its culture, but also for producing many fashion designers, who have developed new techniques and businesses along the way. 
Lisa Folawiyo, a self-made Nigerian fashion designer, is known for her label Jewel by Lisa, which launched in 2005. Her expertise lies in combining traditional West African fabrics with tailoring using modern techniques. In addition, she is known for her custom luxury prints, in which she likes to include nods to traditional African aesthetics. Folawiyo also produces accessories such as jewellery and purses. Line J Label, her diffusion line, showcases the best of Nigerian culture by combining Afropop with tasteful urban designs. Shade Thomas (later Thomas-Fahm) is regarded as Nigeria's first widely recognised fashion designer. After learning about fashion design, she set up a shop at the Federal Palace Hotel and developed a garment factory at the Yaba Industrial Estate. She specialised in the use of locally woven and dyed textiles, and her simple designs helped her gain recognition and customers at home and abroad. Beyond her own outlets, she also exported clothes to the U.S. and joined international shows in Germany, Britain and the Netherlands. Duro Olowu was born in Nigeria but has Jamaican roots. Olowu is well known for his unique and colourful African prints, and he tends to incorporate the rich culture, spirit, and diversity of the Nigerian people in his textiles and prints. Olowu's fame and recognition have led him to work with high-profile figures such as Michelle Obama, Solange Knowles, Uma Thurman, and Linda Evangelista. His interest in fashion dates back to the age of six. Festival There are many festivals in Nigeria, some of which date to the period before the arrival of the major religions in this ethnically and culturally diverse society. The main Muslim and Christian festivals are often celebrated in ways that are unique to Nigeria or unique to the people of a locality. 
The Nigerian Tourism Development Corporation has been working with the states to upgrade the traditional festivals, which may become important sources of tourism revenue. Literature Nigerian citizens have authored many influential works of post-colonial literature in the English language. Nigeria's best-known writers are Wole Soyinka, the first African Nobel Laureate in Literature, and Chinua Achebe, best known for the novel Things Fall Apart (1958) and his controversial critique of Joseph Conrad. Other Nigerian writers and poets who are well known internationally include John Pepper Clark, Ben Okri, Cyprian Ekwensi, Buchi Emecheta, Helon Habila, T. M. Aluko, Isaac Delano, Chimamanda Ngozi Adichie, Daniel O. Fagunwa, Femi Osofisan and Ken Saro-Wiwa, who was executed in 1995 by the military regime. Critically acclaimed writers of a younger generation include Adaobi Tricia Nwaubani, Chris Abani, Sefi Atta, Helon Habila, Helen Oyeyemi, Nnedi Okorafor, Kachi A. Ozumba, Sarah Ladipo Manyika, and Chika Unigwe. Music Nigeria has had a huge role in the development of various genres of African music, including West African Highlife, Palm-wine music, JuJu, Afrobeat, Afrobeats, which fuses native rhythms with techniques that have been linked to the Congo, Brazil, Cuba, Jamaica, United States and worldwide. Many late 20th-century musicians such as Fela Kuti have famously fused cultural elements of various indigenous music with African-American jazz and soul to form Afrobeat which has in turn influenced hip hop music. JuJu music, which is percussion music fused with traditional music from the Yoruba nation and made famous by King Sunny Adé, is from Nigeria. Fuji music, a Yoruba percussion style, was created and popularised by Mr Fuji, |
the succeeding decades, but another drought occurred in the 1790s, again weakening the state. Ecological and political instability provided the background for the jihad of Usman dan Fodio. The military rivalries of the Hausa states strained the region's economic resources at a time when drought and famine undermined farmers and herders. Many Fulani moved into Hausaland and Borno, and their arrival increased tensions because they had no loyalty to the political authorities, who saw them as a source of increased taxation. By the end of the 18th century, some Muslim ulema began articulating the grievances of the common people. Efforts to eliminate or control these religious leaders only heightened the tensions, setting the stage for jihad. According to the Encyclopedia of African History, "It is estimated that by the 1890s the largest slave population of the world, about 2 million people, was concentrated in the territories of the Sokoto Caliphate. The use of slave labour was extensive, especially in agriculture." Northern kingdoms of the Sahel Trade is the key to the emergence of organised communities in the sahelian portions of Nigeria. Prehistoric inhabitants adjusting to the encroaching desert were widely scattered by the third millennium BC, when the desiccation of the Sahara began. Trans-Saharan trade routes linked the western Sudan with the Mediterranean since the time of Carthage and with the Upper Nile from a much earlier date, establishing avenues of communication and cultural influence that remained open until the end of the 19th century. By these same routes, Islam made its way south into West Africa after the 9th century. By then a string of dynastic states, including the earliest Hausa states, stretched into western and central Sudan. The most powerful of these states were Ghana, Gao, and Kanem, which were not within the boundaries of modern Nigeria but which influenced the history of the Nigerian savanna. 
Ghana declined in the 11th century but was succeeded by the Mali Empire, which consolidated much of western Sudan in the 13th century. Following the breakup of Mali, a local leader named Sonni Ali (1464–1492) founded the Songhai Empire in the region of the middle Niger and western Sudan and took control of the trans-Saharan trade. Sonni Ali seized Timbuktu in 1468 and Djenné in 1473, building his regime on trade revenues and the cooperation of Muslim merchants. His successor Askia Muhammad Ture (1493–1528) made Islam the official religion, built mosques, and brought Muslim scholars, including al-Maghili (d. 1504), the founder of an important tradition of Sudanic African Muslim scholarship, to Gao. Although these western empires had little political influence on the Nigerian savanna before 1500, they had a strong cultural and economic impact that became more pronounced in the 16th century, especially because these states became associated with the spread of Islam and trade. Throughout the 16th century, much of northern Nigeria paid homage to Songhai in the west or to Borno, a rival empire in the east. The Golden Age During the 14th to 16th centuries, the demand for gold increased as European and Islamic states sought to base their currencies on gold. This led to an increase in trans-Saharan trade. Kanem–Bornu Empire Borno's history is closely associated with Kanem, which had achieved imperial status in the Lake Chad basin by the 13th century. Kanem expanded westward to include the area that became Borno. The mai (king) of Kanem and his court accepted Islam in the 11th century, as the western empires also had done. Islam was used to reinforce the political and social structures of the state, although many established customs were maintained. Women, for example, continued to exercise considerable political influence. The mai employed his mounted bodyguard and an inchoate army of nobles to extend Kanem's authority into Borno. 
By tradition, the territory was conferred on the heir to the throne to govern during his apprenticeship. In the 14th century, however, dynastic conflict forced the then-ruling group and its followers to relocate to Borno, where, as a result, the Kanuri emerged as an ethnic group in the late 14th and 15th centuries. The civil war that disrupted Kanem in the second half of the 14th century resulted in the independence of Borno. Borno's prosperity depended on the trans-Sudanic slave trade and the desert trade in salt and livestock. The need to protect its commercial interests compelled Borno to intervene in Kanem, which continued to be a theatre of war throughout the 15th century and into the 16th century. Despite its relative political weakness in this period, Borno's court and mosques, under the patronage of a line of scholarly kings, earned fame as centres of Islamic culture and learning. Hausa Kingdoms The Hausa Kingdoms were a collection of states started by the Hausa people, situated between the Niger River and Lake Chad. Their history is reflected in the Bayajidda legend, which describes the adventures of the Baghdadi hero Bayajidda, culminating in the killing of the snake in the well of Daura and the marriage with the local queen, magajiya Daurama. The hero had a child with the queen, Bawo, and another child with the queen's maid-servant, Karbagari. Sarki mythology According to the Bayajidda legend, the Hausa states were founded by the sons of Bayajidda, a prince whose origin differs by tradition, but official canon records him as the person who married the last Kabara of Daura and heralded the end of the matriarchal monarchs that had erstwhile ruled the Hausa people. Contemporary historical scholarship views this legend as an allegory, similar to many in that region of Africa, that probably referenced a major event, such as a shift in ruling dynasties. 
Banza Bakwai According to the Bayajidda legend, the Banza Bakwai states were founded by the seven sons of Karbagari ("Town-seizer"), the only son of Bayajidda and the slave-maid, Bagwariya. They are called the Banza Bakwai, meaning the "Bastard" or "Bogus" Seven, on account of their ancestress' slave status. They were Zamfara (a state inhabited by Hausa-speakers), Kebbi (a state inhabited by Hausa-speakers), Yauri (also called Yawuri), Gwari (also called Gwariland), Kwararafa (the state of the Jukun people), Nupe (the state of the Nupe people), and Ilorin (founded by the Yoruba). Hausa Bakwai The Hausa Kingdoms began as seven states founded, according to the Bayajidda legend, by the six sons of Bawo, the only son of the hero and the queen Magajiya Daurama, in addition to the hero's son of an earlier marriage, Biram or Ibrahim. The states included only kingdoms inhabited by Hausa-speakers: Daura, Kano, Katsina, Zaria (Zazzau), Gobir, Rano, and Biram. Since the beginning of Hausa history, the seven states of Hausaland divided up production and labor activities in accordance with their location and natural resources. Kano and Rano were known as the "Chiefs of Indigo." Cotton grew readily in the great plains of these states, and they became the primary producers of cloth, weaving and dyeing it before sending it off in caravans to the other states within Hausaland and to extensive regions beyond. Biram was the original seat of government, while Zaria supplied labor and was known as the "Chief of Slaves." Katsina and Daura were the "Chiefs of the Market," as their geographical location accorded them direct access to the caravans coming across the desert from the north. Gobir, located in the west, was the "Chief of War" and was mainly responsible for protecting the empire from the invading kingdoms of Ghana and Songhai. Islam arrived in Hausaland along the caravan routes.
The famous Kano Chronicle records the conversion of Kano's ruling dynasty by clerics from Mali, demonstrating that the imperial influence of Mali extended far to the east. Acceptance of Islam was gradual and was often nominal in the countryside, where folk religion continued to exert a strong influence. Nonetheless, Kano and Katsina, with their famous mosques and schools, came to participate fully in the cultural and intellectual life of the Islamic world. The Fulani began to enter the Hausa country in the 13th century, and by the 15th century they were tending cattle, sheep, and goats in Borno as well. The Fulani came from the Senegal River valley, where their ancestors had developed a method of livestock management based on transhumance. Gradually they moved eastward, first into the centers of the Mali and Songhai empires and eventually into Hausaland and Borno. Some Fulani converted to Islam as early as the 11th century and settled among the Hausa, from whom they became racially indistinguishable. There they constituted a devoutly religious, educated elite who made themselves indispensable to the Hausa kings as government advisers, Islamic judges, and teachers. Zenith The Hausa Kingdoms were first mentioned by Ya'qubi in the 9th century, and by the 15th century they were vibrant trading centers competing with Kanem–Bornu and the Mali Empire. The primary exports were slaves, leather, gold, cloth, salt, kola nuts, and henna. At various moments in their history, the Hausa managed to establish central control over their states, but such unity always proved short-lived. In the 11th century, the conquests initiated by Gijimasu of Kano culminated in the birth of the first united Hausa nation under Queen Amina, the Sultana of Zazzau, but severe rivalries between the states led to periods of domination by major powers like the Songhai, Kanem, and the Fulani.
Fall Despite relatively constant growth, the Hausa states were vulnerable to aggression, and although the vast majority of their inhabitants were Muslim by the 16th century, they were attacked by Fulani jihadists from 1804 to 1808. In 1808 the Hausa nation was finally conquered by Usman dan Fodio and incorporated into the Hausa-Fulani Sokoto Caliphate. Yoruba Historically the Yoruba people have been the dominant group on the west bank of the Niger. Their nearest linguistic relatives are the Igala, who live on the opposite side of the confluence of the Niger and the Benue, and from whom they are believed to have split about 2,000 years ago. The Yoruba were organized in mostly patrilineal groups that occupied village communities and subsisted on agriculture. From approximately the 8th century, adjacent village compounds called ile coalesced into numerous territorial city-states in which clan loyalties became subordinate to dynastic chieftains. Urbanisation was accompanied by high levels of artistic achievement, particularly in terracotta and ivory sculpture and in the sophisticated metal casting produced at Ife. The Yoruba are especially known for the Oyo Empire, which dominated the region and held supremacy over other Yoruba states like the Egba Kingdom, Ijebu Kingdom, and the Egbado. At its height, the empire also dominated the Kingdom of Dahomey (located in the modern-day Republic of Benin). The Yoruba venerate a pantheon composed of a Supreme Deity, Olorun, and the orisha. Olorun is now the name used for God in the Yoruba language. There are 400 deities called orisha who perform various tasks. The Yoruba regard Oduduwa as the ancestor of their kings. According to one of the various myths about him, he founded Ife and dispatched his sons and daughters to establish similar kingdoms in other parts of what is today known as Yorubaland.
Yorubaland now consists of different subgroups spread across states in the southwestern part of the country, such as Lagos State, Oyo State, Ondo State, Osun State, Ekiti State, and Ogun State, among others. Igbo Kingdom Nri Kingdom The Kingdom of Nri is considered to be the foundation of Igbo culture and the oldest kingdom in Nigeria. Nri and Aguleri, where the Igbo creation myth originates, are in the territory of the Umueri clan, who trace their lineages back to the patriarchal king-figure, Eri. Eri's origins are unclear, though he has been described as a "sky being" sent by Chukwu (God). He has been characterized as having first given societal order to the people of Anambra. Archaeological evidence suggests that Nri hegemony in Igboland may go back as far as the 9th century, and royal burials have been unearthed dating to at least the 10th century. Eri, the god-like founder of Nri, is believed to have settled in the region around 948, with other related Igbo cultures following in the 13th century. The first Eze Nri (King of Nri), Ìfikuánim, followed directly after him. According to Igbo oral tradition, his reign started in 1043. At least one historian puts Ìfikuánim's reign much later, around 1225. The Kingdom of Nri was a religio-polity, a sort of theocratic state, that developed in the central heartland of the Igbo region. The Nri had a symbolic taboo code with six types. These included human (such as the birth of twins), animal (such as the killing or eating of pythons), object, temporal, behavioral, speech, and place taboos. The rules regarding these taboos were used to educate and govern Nri's subjects. This meant that, while certain Igbo may have lived under different formal administrations, all followers of the Igbo religion had to abide by the rules of the faith and obey its representative on earth, the Eze Nri.
Decline of Nri kingdom With the decline of the Nri kingdom in the 15th to 17th centuries, several states once under its influence became powerful economic oracular oligarchies and large commercial states that dominated Igboland. The neighboring Awka city-state rose in power as a result of its powerful Agbala oracle and metalworking expertise. The Onitsha Kingdom, originally inhabited by Igbos from east of the Niger, was founded in the 16th century by migrants from Anioma (western Igboland). Later groups, like the Igala traders from the hinterland, settled in Onitsha in the 18th century. Western Igbo kingdoms like Aboh dominated trade in the lower Niger area from the 17th century until European penetration. The Umunoha state in the Owerri area used the Igwe ka Ala oracle to its advantage. However, Cross River Igbo states like the Aro had the greatest influence in Igboland and adjacent areas after the decline of Nri. The Arochukwu kingdom emerged after the Aro-Ibibio Wars from 1630 to 1720 and went on to form the Aro Confederacy, which economically dominated the Eastern Nigerian hinterland. The source of the Aro Confederacy's economic dominance was the judicial oracle of Ibini Ukpabi ("Long Juju") and its military forces, which included powerful allies such as Ohafia, Abam, Ezza, and other related neighboring states. The Abiriba and the Aro are brothers whose migration is traced to the Ekpa Kingdom; their exact point of origin was Ekpa (Mkpa), east of the Cross River. They crossed the river to Urupkam (Usukpam), west of the Cross River, and founded two settlements: Ena Uda and Ena Ofia in present-day Erai. The Aro and Abiriba cooperated to become a powerful economic force. Igbo gods, like those of the Yoruba, were numerous, but their relationship to one another and to human beings was essentially egalitarian, reflecting Igbo society as a whole.
A number of oracles and local cults attracted devotees, while the central deity, the earth mother and fertility figure Ala, was venerated at shrines throughout Igboland. The popular theory that the Igbo were stateless rests on the paucity of historical evidence of pre-colonial Igbo society, which is its chief weakness. There is a huge gap between the archaeological finds of Igbo Ukwu, which reveal a rich material culture in the heart of the Igbo region in the 8th century, and the oral traditions of the 20th century. Benin exercised considerable influence on the western Igbo, who adopted many of the political structures familiar to the Yoruba-Benin region, but Asaba and its immediate neighbours, such as Ibusa, Ogwashi-Ukwu, Okpanam, Issele-Azagba and Issele-Ukwu, were much closer to the Kingdom of Nri. Ofega was the queen of the Onitsha Igbo. Akwa Akpa The modern city of Calabar was founded in 1786 by Efik families who had left Creek Town, farther up the Calabar River. They settled on the east bank in a position from which they could dominate traffic with the European vessels that anchored in the river, and they soon became the most powerful group in the region, which extended from present-day Calabar down to Bakassi in the east and the Oron Nation in the west. Akwa Akpa (named Calabar by the Spanish) became a center of the Atlantic slave trade, where African slaves were sold in exchange for European manufactured goods. Igbo people formed the majority of the enslaved Africans sold from Calabar, despite forming a minority among the ethnic groups in the region. From 1725 until 1750, roughly 17,000 enslaved Africans were sold from Calabar to European slave traders; from 1772 to 1775, the number soared to over 62,000. With the suppression of the slave trade, palm oil and palm kernels became the main exports. The chiefs of Akwa Akpa placed themselves under British protection in 1884.
From 1884 until 1906 Old Calabar was the headquarters of the Niger Coast Protectorate, after which Lagos became the main center. Now called Calabar, the city remained an important port shipping ivory, timber, beeswax, and palm produce until 1916, when the railway terminus was opened at Port Harcourt, 145 km to the west. A British sphere of influence Following the Napoleonic Wars, the British expanded trade with the Nigerian interior. In 1885, British claims to a West African sphere of influence received international recognition, and in the following year the Royal Niger Company was chartered under the leadership of Sir George Taubman Goldie. On 31 December 1899 the charter of the Royal Niger Company was revoked by the British government, and the sum of £865,000 was paid to the company as compensation. The entire territory of the Royal Niger Company came into the hands of the British government. On 1 January 1900, the British Empire created the Southern Nigeria Protectorate and the Northern Nigeria Protectorate. In 1914, the area was formally united as the Colony and Protectorate of Nigeria. Administratively, Nigeria remained divided into the Northern and Southern Provinces and Lagos Colony. Western education and the development of a modern economy proceeded more rapidly in the south than in the north, with consequences felt in Nigeria's political life ever since. Following World War II, in response to the growth of Nigerian nationalism and demands for independence, successive constitutions legislated by the British government moved Nigeria toward self-government on a representative and increasingly federal basis. On 1 October 1954, the colony became the autonomous Federation of Nigeria. By the middle of the 20th century, a great wave of demands for independence was sweeping across Africa. On 27 October 1958 Britain agreed that Nigeria would become an independent state on 1 October 1960.
Independence The Federation of Nigeria was granted full independence on 1 October 1960 under a constitution that provided for a parliamentary government and a substantial measure of self-government for the country's three regions. From 1959 to 1960, Jaja Wachuku was the first Nigerian Speaker of the Nigerian Parliament, also called the "House of Representatives." Jaja Wachuku replaced Sir Frederick Metcalfe of Britain. Notably, as first Speaker of the House, Jaja Wachuku received Nigeria's Instrument of Independence, also known as the Freedom Charter, on 1 October 1960 from Princess Alexandra of Kent, the Queen's representative at the Nigerian independence ceremonies. In 1962 a split developed in the Action Group (AG), the ruling party of the Western Region. One faction, led by Premier Ladoke Akintola, argued that the Yoruba peoples were losing their pre-eminent position in business in Nigeria to people of the Igbo tribe because the Igbo-dominated NCNC was part of the governing coalition and the AG was not. The federal Prime Minister, Balewa, agreed with the Akintola faction and sought to have the AG join the government. The party leadership under Awolowo disagreed and replaced Akintola as premier of the West with one of their own supporters. However, when the Western Region parliament met to approve this change, Akintola supporters in the parliament started a riot in the chambers of the parliament. Fighting between the members broke out. Chairs were thrown, and one member grabbed the parliamentary mace and wielded it like a weapon to attack the Speaker and other members. Eventually, police with tear gas were required to quell the riot. In subsequent attempts to reconvene the Western parliament, similar disturbances broke out. Unrest continued in the West and contributed to the Western Region's reputation for violence, anarchy, and rigged elections. Federal Prime Minister Balewa declared martial law in the Western Region, arrested Awolowo and other members of his faction, and charged them with treason.
Akintola was appointed to head a coalition government in the Western Region. Thus, the AG was reduced to an opposition role in its own stronghold. First Republic In October 1963 Nigeria proclaimed itself the Federal Republic of Nigeria, and former Governor-General Nnamdi Azikiwe became the country's first President. From the outset, Nigeria's ethnic and religious tensions were magnified by the disparities in economic and educational development between the south and the north. The AG was manoeuvred out of control of the Western Region by the Federal Government, and a new pro-government Yoruba party, the Nigerian National Democratic Party (NNDP), took over. Shortly afterwards the AG opposition leader, Chief Obafemi Awolowo, was imprisoned on charges widely held to be without foundation. The 1965 national election produced a major realignment of politics and a disputed result that set the country on the path to civil war. The dominant northern NPC went into a conservative alliance with the new Yoruba NNDP, leaving the Igbo NCNC to coalesce with the remnants of the AG in a progressive alliance. In the vote, widespread electoral fraud was alleged, and riots erupted in the Yoruba West, where the heartlands of the AG discovered they had apparently elected pro-government NNDP representatives. First period of military rule On 15 January 1966 a group of army officers (the "Young Majors"), mostly south-eastern Igbos, overthrew the NPC-NNDP government and assassinated the prime minister and the premiers of the northern and western regions. However, the bloody nature of the Young Majors' coup prompted a counter-move by General Johnson Aguiyi-Ironsi, who assumed power. The Young Majors went into hiding. Major Emmanuel Ifeajuna fled to Kwame Nkrumah's Ghana, where he was welcomed as a hero. Some of the Young Majors were arrested and detained by the Ironsi government. Among the Igbo people of the Eastern Region, these detainees were heroes.
In the Northern Region, however, the Hausa and Fulani people demanded that the detainees be placed on trial for murder. The federal military government that assumed power under General Johnson Aguiyi-Ironsi was unable to quiet ethnic tensions over this and other issues. Additionally, the Ironsi government was unable to produce a constitution acceptable to all sections of the country. Most fateful for the Ironsi government was the decision to issue Decree No. 34, which sought to unify the nation. Decree No. 34 sought to do away with the whole federal structure under which the Nigerian government had been organised since independence. Rioting broke out in the North. The Ironsi government's efforts to abolish the federal structure and the renaming of the country as the Republic of Nigeria on 24 May 1966 raised tensions and led to another coup by largely northern officers in July 1966, which established the leadership of Major General Yakubu Gowon. The name Federal Republic of Nigeria was restored on 31 August 1966. However, the subsequent massacre of thousands of Ibo in the north prompted hundreds of thousands of them to return to the south-east, where increasingly strong Igbo secessionist sentiment emerged. In a move to grant greater autonomy to minority ethnic groups, the military divided the four regions into 12 states. However, the Igbo rejected attempts at constitutional revisions and insisted on full autonomy for the east. The Central Intelligence Agency commented in October 1966 in a CIA Intelligence Memorandum that: "Africa's most populous country (population estimated at 48 million) is in the throes of a highly complex internal crisis rooted in its artificial origin as a British dependency containing over 250 diverse and often antagonistic tribal groups. The present crisis started" with Nigerian independence in 1960, but the federated parliament hid "serious internal strains.
It has been in an acute stage since last January when a military coup d'état destroyed the constitutional regime bequeathed by the British and upset the underlying tribal and regional power relationships. At stake now are the most fundamental questions which can be raised about a country, beginning with whether it will survive as a single viable entity. The situation is uncertain: Nigeria is sliding downhill faster and faster, with less and less chance of unity and stability. Unless present army leaders and contending tribal elements soon reach agreement on a new basis for the association and take some effective measures to halt a seriously deteriorating security situation, there will be increasing internal turmoil, possibly including civil war. On 30 May 1967, Lt. Col. Emeka Ojukwu, the military governor of the eastern region, who had emerged as the leader of increasing Igbo secessionist sentiment, declared the independence of the eastern region as the Republic of Biafra. The ensuing Nigerian Civil War resulted in an estimated 3.5 million deaths (mostly children who died of starvation) before the war ended with Gowon's famous "No victor, no vanquished" speech in 1970. Following the civil war, the country turned to the task of economic development. The U.S. intelligence community concluded in November 1970 that "...The Nigerian Civil War ended with relatively little rancour. The Igbos were accepted as fellow citizens in many parts of Nigeria, but not in some areas of former Biafra where they were once dominant. Iboland is an overpopulated, economically depressed area where massive unemployment is likely to continue for many years. The U.S. analysts said that "...Nigeria is still very much a tribal society..." where local and tribal alliances count more than "national attachment." General Yakubu Gowon, head of the Federal Military Government (FMG), is the accepted national leader, and his popularity has grown since the end of the war.
The FMG is neither very efficient nor dynamic, but the recent announcement that it intends to retain power for six more years has generated little opposition so far. The Nigerian Army, vastly expanded during the war, is both the main support of the FMG and the chief threat to it. The troops are poorly trained and disciplined, and some of the officers are turning to conspiracies and plotting. We think Gowon will have great difficulty in staying in office through the period which he said is necessary before the turnover of power to civilians. His sudden removal would dim the prospects for Nigerian stability." "Nigeria's economy came through the war in better shape than expected." Problems exist with inflation, internal debt, and a huge military budget competing with popular demands for government services. "The petroleum industry is expanding faster than expected and oil revenues will help defray military and social service expenditures..." "Nigeria emerged from the war with a heightened sense of national pride mixed with an anti-foreign sentiment, and an intention to play a larger role in African and world affairs." British cultural influence is strong, but its political influence is declining. The Soviet Union benefits from Nigerian appreciation of its help during the war, but is not trying for control. Nigerian relations with the US, cool during the war, are improving, but France may be seen as the future patron. "Nigeria is likely to take a more active role in funding liberation movements in southern Africa." Lagos, however, is not perceived as the "spiritual and bureaucratic capital of Africa"; Addis Ababa has that role. Foreign exchange earnings and government revenues increased spectacularly with the oil price rises of 1973–74. On 29 July 1975, Gen. Murtala Mohammed and a group of officers staged a bloodless coup, accusing Gen. Yakubu Gowon of corruption and delaying the promised return to civilian rule.
General Mohammed replaced thousands of civil servants and announced a timetable for the resumption of civilian rule by 1 October 1979. He was assassinated on 13 February 1976 in an abortive coup, and his chief of staff, Lt. Gen. Olusegun Obasanjo, became head of state. Second Republic A constituent assembly was elected in 1977 to draft a new constitution, which was published on 21 September 1978, when the ban on political activity was lifted. In 1979, five political parties competed in a series of elections in which Alhaji Shehu Shagari of the National Party of Nigeria (NPN) was elected president. All five parties won representation in the National Assembly. During the 1950s, prior to independence, oil was discovered off the coast of Nigeria. Almost immediately, the revenues from oil began to make Nigeria a wealthy nation. The spike in oil prices from $3 per barrel to $12 per barrel following the Yom Kippur War in 1973 brought a sudden rush of money to Nigeria. Another sudden rise in the price of oil, to $19 per barrel, occurred in 1979 as a result of the lead-up to the Iran–Iraq War. All of this meant that by 1979 Nigeria was the sixth largest producer of oil in the world, with oil revenues of $24 billion per year. In 1982 the ruling National Party of Nigeria, a conservative alliance led by Shehu Shagari, hoped to retain power through patronage and control over the Federal Election Commission. In August 1983, Shagari and the NPN were returned to power in a landslide, with a majority of seats in the National Assembly and control of 12 state governments. But the elections were marred by violence and allegations of widespread voter fraud, including missing returns, polling places failing to open, and obvious rigging of results. There was a fierce legal battle over the results, with the legitimacy of the victory at stake. On 31 December 1983, the military overthrew the Second Republic.
Major General Muhammadu Buhari emerged as the leader of the Supreme Military Council (SMC), the country's new ruling body. The Buhari government was peacefully overthrown by the SMC's third-ranking member, General Ibrahim Babangida, in August 1985. Babangida (IBB) cited the misuse of power, violations of human rights by key officers of the SMC, and the government's failure to deal with the country's deepening economic crisis as justifications for the takeover. During his first days in office, President Babangida moved to restore freedom of the press and to release political detainees being held without charge. As part of a 15-month economic emergency plan, he announced pay cuts for the military, police, civil servants, and the private sector. President Babangida demonstrated his intent to encourage public participation in decision making by opening a national debate on proposed economic reform and recovery measures. The public response convinced Babangida of intense opposition to an economic recovery package dependent on an International Monetary Fund (IMF) loan. The Abortive Third Republic Head of State Babangida promised to return the country to civilian rule by 1990; this was later extended until January 1993. In early 1989 a constituent assembly completed a constitution, and in the spring of 1989 political activity was again permitted. In October 1989 the government established two parties, the National Republican Convention (NRC) and the Social Democratic Party (SDP); other parties were not allowed to register. In April 1990 mid-level officers attempted unsuccessfully to overthrow the government, and 69 accused plotters were executed after secret trials before military tribunals. In December 1990 the first stage of partisan elections was held at the local government level. Despite the low turnout, there was no violence, and both parties demonstrated strength in all regions of the country, with the SDP winning control of a majority of local government councils.
In December 1991 state legislative elections were held, and Babangida decreed that previously banned politicians could contest in primaries scheduled for August. These were cancelled due to fraud, and subsequent primaries scheduled for September were also cancelled. All announced candidates were disqualified from standing for president once a new election format was selected. The presidential election was finally held on 12 June 1993, with the inauguration of the new president scheduled to take place on 27 August 1993, the eighth anniversary of President Babangida's coming to power. In the historic 12 June 1993 presidential elections, which most observers deemed to be Nigeria's fairest, early returns indicated that wealthy Yoruba businessman M. K. O. Abiola had won a decisive victory. However, on 23 June, Babangida, using several pending lawsuits as a pretext, annulled the election, throwing Nigeria into turmoil. More than 100 people were killed in riots before Babangida agreed to hand power to an interim government on 27 August 1993. He later attempted to renege on this decision, but without popular and military support, he was forced to hand over to Ernest Shonekan, a prominent nonpartisan businessman. Shonekan was to rule until elections scheduled for February 1994. Although he had led Babangida's Transitional Council since 1993, Shonekan was unable to reverse Nigeria's economic problems or to defuse lingering political tension. Sani Abacha With the country sliding into chaos, Defense Minister Sani Abacha assumed power and forced Shonekan's resignation on 17 November 1993. Abacha dissolved all democratic institutions and replaced elected governors with military officers. Although promising restoration of civilian rule, he refused to announce a transitional timetable until 1995.
Following the annulment of the June 12 election, the United States and others imposed sanctions on Nigeria, including travel restrictions on government officials and the suspension of arms sales and military assistance. Additional sanctions were imposed as a result of Nigeria's failure to gain full certification for its counter-narcotics efforts. Although Abacha was initially welcomed by many Nigerians, disenchantment grew rapidly. Opposition leaders formed the National Democratic Coalition (NADECO), which campaigned to reconvene the Senate and other disbanded democratic institutions. On 11 June 1994 Moshood Kashimawo Olawale Abiola declared himself president and went into hiding until his arrest on 23 June. In response, petroleum workers called a strike demanding that Abacha release Abiola and hand over power to him. Other unions joined the strike, bringing economic life around Lagos and the southwest to a standstill. After calling off a threatened strike in July, the Nigeria Labour Congress (NLC) reconsidered a general strike in August after the government imposed conditions on Abiola's release. On 17 August 1994, the government dismissed the leadership of the NLC and the petroleum unions, placed the unions under appointed administrators, and arrested Frank Kokori and other labor leaders. The government alleged in early 1995 that military officers and civilians were engaged in a coup plot. Security officers rounded up the accused, including former Head of State Obasanjo and his deputy, retired General Shehu Musa Yar'Adua. After a secret tribunal, most of the accused were convicted, and several death sentences were handed down. In 1994 the government set up the Ogoni Civil Disturbances Special Tribunal to try the Ogoni activist Ken Saro-Wiwa and others for their alleged roles in the killings of four Ogoni politicians. The tribunal sentenced Saro-Wiwa and eight others to death, and they were executed on 10 November 1995.
On 1 October 1995 Abacha announced the timetable for a three-year transition to civilian rule. Only five political parties were approved by the regime, and voter turnout for local elections in December 1997 was under 10%. On 20 December 1997, the government arrested General Oladipo Diya, ten officers, and eight civilians on charges of coup plotting. The accused were tried before a military tribunal headed by Gen. Victor Malu, in which Diya and five others, the late Gen. A. K. Adisa, Gen. Tajudeen Olanrewaju, the late Col. O. O. Akiyode, Major Seun Fadipe, and a civilian, Engr. Bola Adebanjo, were sentenced to death by firing squad. Abacha enforced authority through the federal security system, which was accused of numerous human rights abuses, including infringements on freedom of speech, assembly, association, and travel, and violence against women. Abubakar's transition to civilian rule Abacha died of heart failure on 8 June 1998 and was replaced by General Abdulsalami Abubakar. The military Provisional Ruling Council (PRC) under Abubakar commuted the sentences of those accused in the alleged coup during the Abacha regime and released almost all known civilian political detainees. Pending the promulgation of the constitution written in 1995, the government observed some provisions of the 1979 and 1989 constitutions. Neither Abacha nor Abubakar lifted the decree suspending the 1979 constitution, and the 1989 constitution was not implemented. The judiciary system continued to be hampered by corruption and lack of resources after Abacha's death. In an attempt to alleviate such problems, Abubakar's government implemented a civil service pay raise and other reforms. In August 1998 Abubakar appointed the Independent National Electoral Commission (INEC) to conduct elections for local government councils, state legislatures and governors, the national assembly, and president. INEC successfully held the elections on 5 December 1998, 9 January 1999, 20 February, and 27 February 1999, respectively.
For local elections, nine parties were granted provisional registration, with three fulfilling the requirements to contest the subsequent elections. These parties were the People's Democratic Party (PDP), the All People's Party (APP), and the predominantly Yoruba Alliance for Democracy (AD). The former military head of state Olusegun Obasanjo, freed from prison by Abubakar, ran as a civilian candidate and won the presidential election. The PRC promulgated a new constitution, based largely on the suspended 1979 constitution, before the 29 May 1999 inauguration of the new civilian president. The constitution includes provisions for a bicameral legislature, the National Assembly, consisting of a 360-member House of Representatives and a 109-member Senate. Fourth Republic The emergence of democracy in Nigeria in May 1999 ended 16 years of consecutive military rule. Olusegun Obasanjo inherited a country suffering economic stagnation and the deterioration of most democratic institutions. Obasanjo, a former general, was admired for his stand against the Abacha dictatorship, his record of returning the federal government to civilian rule in 1979, and his claim to represent all Nigerians regardless of religion. The new president took over a country that faced many problems, including a dysfunctional bureaucracy, collapsed infrastructure, and a military that wanted a reward for returning quietly to the barracks. The president moved quickly: he retired hundreds of military officers holding political positions, established a blue-ribbon panel to investigate human rights violations, released scores of persons held without charge, and rescinded numerous questionable licenses and contracts left by the previous regimes. The government also moved to recover millions of dollars secreted in overseas accounts. Most civil society leaders and ordinary Nigerians witnessed marked improvements in human rights and freedom of the press under Obasanjo.
As Nigeria works out representational democracy, conflicts persist between the executive and legislative branches over appropriations and other proposed legislation. A sign of federalism has been the growing visibility of state governors and the inherent friction between Abuja and the state capitals over resource allocation. Communal violence has plagued the Obasanjo government since its inception. In May 1999, violence erupted in Kaduna State over the succession of an emir, resulting in more than 100 deaths. In November 1999, the army destroyed the town of Odi, Bayelsa State, and killed scores of civilians in retaliation for the murder of 12 policemen by a local gang. In Kaduna in February–May 2000, over 1,000 people died in rioting over the introduction of criminal Shari'a in the state. Hundreds of ethnic Hausa were killed in reprisal attacks in south-eastern Nigeria. In September 2001, over 2,000 people were killed in inter-religious rioting in Jos. In October 2001, hundreds were killed and thousands displaced in communal violence that spread across the states of Benue, Taraba, and Nasarawa. On 1 October 2001, Obasanjo announced the formation of a National Security Commission to address the issue of communal violence. Obasanjo was reelected in 2003. He faced the daunting task of rebuilding a petroleum-based economy whose revenues had been squandered through corruption and mismanagement, and his administration needed to defuse longstanding ethnic and religious tensions to build a foundation for economic growth and political stability. There is ongoing conflict in the Niger Delta over the environmental destruction caused by oil drilling and the persistent poverty of the oil-rich region. A further major problem created by the oil industry is the puncturing of pipelines by the local population in an attempt to drain off petroleum for personal use or as a source of income.
This often leads to major explosions and high death tolls. Particularly notable disasters in this area have been: 1) October 1998, Jesse, 1,100 deaths; 2) July 2000, Jesse, 250 deaths; 3) September 2004, near Lagos, 60 deaths; 4) May 2006, Ilado, approx. 150–200 deaths (current estimate). Two militants of an unknown faction shot and killed Ustaz Ja'afar Adam, a northern Muslim religious leader and Kano State official, along with one of his disciples in a mosque in Kano during dawn prayers on 13 April 2007. Obasanjo had recently stated on national radio that he would "deal firmly" with election fraud and violence advocated by "highly placed individuals." His comments were interpreted by some analysts as a warning to his Vice President and 2007 presidential candidate Atiku Abubakar. In the 2007 general election, Umaru Yar'Adua and Goodluck Jonathan, both of the People's Democratic Party, were elected President and Vice President, respectively. The election was marred by electoral fraud and denounced by other candidates and international observers. Yar'Adua's sickness and Jonathan's succession Yar'Adua's presidency was fraught with uncertainty, as media reports said he suffered from kidney and heart disease. In November 2009, he fell ill and was flown out of the country to Saudi Arabia for medical attention. He remained incommunicado for 50 days, by which time rumours were rife that he had died. This continued until the BBC aired an interview allegedly conducted by telephone from the president's sickbed in Saudi Arabia. As of January 2010, he was still abroad. In February
transportation, but its presence brings some relief to farmers, since the low humidity in the air quickens the drying of their crops. Temperature Nigeria's seasons and temperature variance are determined by rainfall, the rainy season and the dry season being the country's two major seasons. The rainy season brings cooler weather, as increased cloud cover blocks much of the intense tropical sunshine; this cools the land, and the winds above the ground remain cool, making for lower temperatures. Afternoons in the rainy season can nevertheless be hot and humid, and rainfall is usually abundant. The dry season is a period of little cloud cover in the southern part of Nigeria and virtually none in the north. The sun shines through the atmosphere with little obstruction from the clear skies, making the dry season a period of warm weather. In the middle of the dry season, around December, the dust brought in by the Harmattan partially blocks the sun's rays, which lowers temperatures. With the withdrawal of this wind around March to April, at the onset of the rainy season, temperatures can climb steeply in some parts of Nigeria. Semi-temperate weather conditions prevail on the highlands of central Nigeria lying well above sea level, namely the Jos Plateau. Temperatures on the Jos Plateau range between 16 °C and 25 °C and are cool all year round. Temperate weather conditions occur on the highlands along the Nigeria–Cameroon border, in the eastern part of Nigeria. Highlands in this region attain considerable average heights, with some peaks standing well above sea level. The climate on these highlands is temperate all year round. The major highlands in this region are the Obudu Plateau, the Mambilla Plateau, and Mt.
Chappal Waddi. Topography Nigeria's most expansive topographical region is that of the Niger and Benue River valleys, which merge into each other and form a "y"-shaped confluence at Lokoja. Plains rise to the north of the valleys. To the southwest of the Niger there is "rugged" highland, and to the southeast of the Benue, hills and mountains are found all the way to the border with Cameroon. Coastal plains are found in both the southwest and the southeast. Niger Delta The Niger Delta is located in the southern part of Nigeria. It is one of the world's largest arcuate, fan-shaped river deltas. The riverine area of the Niger Delta is a coastal belt of swamps bordering the Atlantic. The mangrove swamps are vegetated tidal flats formed by a reticulate pattern of interconnected meandering creeks and tributaries of the Niger River. About 70% of Nigeria's crude oil and gas production comes from the area. A recent global remote sensing analysis suggested that there were 1,244 km² of tidal flats in Nigeria, making it the 27th-ranked country in terms of tidal flat area. Vegetation Nigeria is covered by three types of vegetation: forests (where there is significant tree cover), savannahs (insignificant tree cover, with grasses and flowers located between trees), and montane land (the least common, mainly found in the mountains near the Cameroon border). Both the forest zone and the savannah zone are divided into three parts. The forest zone's most southerly portion, especially around the Niger River and Cross River deltas, is mangrove swamp. North of this is freshwater swamp, containing different vegetation from the salt-water mangrove swamps, and north of that is rain forest.
The savannah zone is divided into three categories: Guinean forest-savanna mosaic, made up of plains of tall grass interrupted by trees, the most common across the country; Sudan savannah, with short grasses and short trees; and Sahel savannah, with patches of grass and sand, found in the northeast. Natural resources and land use Nigeria's natural resources include but are not limited to petroleum, tin, columbite, iron ore, coal, limestone, lead, zinc, natural gas, hydropower, and arable land. Extreme points This is a list of the extreme points of Nigeria, the points that are farther north, south, east or west than any other location. Northernmost point – unnamed location on the border with Niger immediately northwest of the town | with the Harmattan wind, a continental tropical (CT) air mass laden with dust from the Sahara, prevailing throughout this period. With the Intertropical Convergence Zone swinging northward over West Africa from the Southern Hemisphere in April, heavy showers from pre-monsoonal convective clouds, mainly in the form of squall lines formed by the interaction of Nigeria's two dominant air masses, the maritime tropical (the south-westerlies) and the continental tropical (the north-easterlies), begin in central Nigeria, while the monsoon arrives in July, bringing high humidity, heavy cloud cover and heavy rainfall that lasts until September, when the monsoon gradually begins retreating southward to the southern part of Nigeria. Rainfall totals in central Nigeria vary, from lower totals in the lowlands to greater totals along the south-western escarpment of the Jos Plateau. A hot semi-arid climate (BSh) is predominant within the Sahel in the northern part of Nigeria. Annual rainfall totals are low. The rainy season in the northern part of Nigeria lasts for three to four months (June–September). The rest of the year is hot and dry.
Potiskum, Yobe State, in the northeast of Nigeria, recorded Nigeria's lowest ever temperature. Alpine climate, also called highland or mountain climate, is found in highland regions of Nigeria. Highlands with an alpine climate in Nigeria lie well above sea level. Because of the location in the tropics, this elevation is high enough to reach the temperate climate line, giving the highlands, mountains and plateau regions standing above this height a cool mountain climate. Rainfall Rainfall in the coastal belt of the Niger Delta is heavy due to the closeness of the Delta region to the equator. Annual rainfall totals vary from 2,400 to over 4,000 millimeters. Niger Delta cities and their annual rainfall totals in millimeters: Warri — 2,730 mm Forcados (coastal town in the Niger Delta) — 4,870 mm Port Harcourt — 2,400 mm Calabar (coastal city) — 3,070 mm (rainiest city with over one million people in Nigeria) Bonny (south of Port Harcourt) — 4,200 mm Trade winds Tropical maritime air mass The tropical maritime air mass (MT) is responsible for Nigeria's rainy season. This wind begins in February in the southern part of Nigeria, but it takes longer to cover the whole of the country, reaching the northern part of Nigeria in June. Its presence is a result of the northward retreat of the Harmattan. The northward retreat of the tropical continental air mass (CT) is caused by the sun's northward shift from the Tropic of Capricorn in the southern hemisphere to the Tropic of Cancer in the northern hemisphere. This shift begins in February and ends in June, when the sun is fully overhead at the Tropic of Cancer. During this northward migration of the sun, a result of the tilt of the earth's axis, the sun crosses the equator (around March), moving over West Africa. West Africa comes directly under the sun at this time, and the whole of West Africa is heated intensely as a result of the increased insolation.
Temperatures climb steeply over West Africa during this time, and in the northern part of Nigeria they reach their highest in cities like Maiduguri. The high temperatures, coupled with the increase in insolation, cause a region of low pressure to develop over West Africa and Nigeria between March and May. The tropical continental air mass from the Sahara is weakened by the overheating of the land surface; this transfer of heat causes the atmosphere to expand and become lighter. The air mass loses its strength, from around February in the southern part of Nigeria to June in northern Nigeria, and begins to retreat; the rising of air by convection within the air mass further weakens its dominance over West Africa, and it finally retreats from most of Nigeria around April to May. The sun's rays then enter Nigeria's atmosphere more intensely than they do during the Harmattan, which carries dust and haze. The heating of the West African land mass creates a low-pressure region over West Africa, which aids the development of the tropical maritime air mass from the South Atlantic. The tropical maritime air mass is a warm, humid and unstable trade wind. Convection currents form within it whenever there is even slight instability, whether from orographic uplift in mountainous regions like the Obudu Plateau or from the heating of the land; these can trigger the formation of cumulonimbus clouds, leading to thunderstorms within the air mass. During the dominance of the tropical maritime air mass, mornings are bright and sunny; the sun's heating of the land in the mornings and afternoons sets up convectional currents, these currents rise vertically, cumulonimbus clouds form, and torrential downpours can occur in the afternoon. The African easterly wave is another major contributor to rainfall during the summer monsoon months of May to
Population projections The total population in sub-Saharan Africa is projected to increase to almost one billion people, making it the most populated region outside of South-Central Asia. According to the United Nations, the population of Nigeria will reach 411 million by 2050. Nigeria might then be the 3rd most populous country in the world. In 2100, the population of Nigeria may reach 794 million. While the overall population is expected to increase, the growth rate is estimated to decrease from 1.2 percent per year in 2010 to 0.4 percent per year in 2050. The birth rate is also projected to decrease from 20.7 to 13.7, while the death rate is projected to increase from 8.5 in 2010 to 9.8 in 2050. Life expectancy is also expected to increase, from 67.0 years in 2010 to 75.2 years in 2050. By 2050, 69.6% of the population is estimated to be living in urban areas, compared to 50.6% in 2010. Vital statistics Registration of vital events in Nigeria is not complete. The Population Division of the United Nations prepared the following estimates. Life expectancy at birth Life expectancy from 1950 to 2015 (UN World Population Prospects): Other demographic statistics The following demographic statistics of Nigeria in 2019 are from the World Population Review.
One birth every 4 seconds
One death every 14 seconds
One net migrant every 9 minutes
Net gain of one person every 6 seconds
The following demographic statistics are from the CIA World Factbook, unless otherwise indicated. Population
203,452,505 (July 2018 est.)
178.5 million (2014 est.)
174,507,539 (July 2013 est.)
Population distribution Nigeria is Africa's most populous country. Significant population clusters are scattered throughout the country, with the highest density areas being in the south and southwest. Age structure 2018 est.
0–14 years: 42.45% (male 44,087,799/female 42,278,742)
15–24 years: 19.81% (male 20,452,045/female 19,861,371)
25–54 years: 30.44% (male 31,031,253/female 30,893,168)
55–64 years: 4.04% (male 4,017,658/female 4,197,739)
65 years and over: 3.26% (male 3,138,206/female 3,494,524)
2017 est.
0–14 years: 42.5% (male 41,506,288/female 39,595,720)
15–24 years: 19.6% (male 19,094,899/female 18,289,513)
25–54 years: 30.7% (male 30,066,196/female 28,537,846)
55–64 years: 3.9% (male 3,699,947/female 3,870,080)
65 years and over: 3% (male 2,825,134/female 3,146,638)
2013 est.
0–14 years: 43.8% (male 39,127,615/female 37,334,281)
15–24 years: 19.3% (male 17,201,067/female 16,451,357)
25–54 years: 30.1% (male 25,842,967/female 26,699,432)
55–64 years: 3.8% (male 3,016,896/female 3,603,048)
65 years and over: 3% (male 2,390,154/female 2,840,722)
Birth rate
35.2 births/1,000 population (2018 est.) Country comparison to the world: 20th
36.9 births/1,000 population (2017 est.)
38.78 births/1,000 population (2013 est.)
Death rate
9.6 deaths/1,000 population (2018 est.) Country comparison to the world: 46th
12.4 deaths/1,000 population (2017 est.)
13.2 deaths/1,000 population (2013 est.)
Total fertility rate
4.85 children born/woman (2018 est.) Country comparison to the world: 16th
5.07 children born/woman (2017 est.)
Population growth rate
2.54% (2018 est.) Country comparison to the world: 21st
2.43% (2017 est.) Country comparison to the world: 24th
2.54% (2013 est.)
Mother's mean age at first birth
20.3 years Note: median age at first birth among women 25–29 (2013 est.)
Contraceptive prevalence rate
13.4% (2016/17)
Net migration rate
-0.2 migrant(s)/1,000 population (2017 est.) Country comparison to the world: 106th
-0.22 migrant(s)/1,000 population (2013 est.)
Dependency ratios
total dependency ratio: 88.2
youth dependency ratio: 83
potential support ratio: 19.4 (2015 est.)
Urbanization
urban population: 50.3% of total population (2018)
rate of urbanization: 4.2% annual rate of change (2015–20 est.)
Life expectancy at birth
total population: 59.3 years (2018 est.)
male: 57.5 years (2018 est.)
female: 61.1 years (2018 est.)
total population: 52.05 years
male: 48.95 years
female: 55.33 years (2012 est.)
total population: 46.94 years
male: 46.16 years
female: 47.76 years (2009 est.)
total population: 51.56 years
male: 51.58 years
female: 51.55 years (2000 est.)
HIV/AIDS
adult prevalence rate: 2.8% (2017 est.)
people living with HIV/AIDS: 2.6 million (2007 est.); 3.3 million (2009 est.)
School life expectancy (primary to tertiary education)
total: 15 years
male: 14 years
female: 15 years (2011)
Literacy
definition: age 15 and over can read and write
total population: 59.6%
male: 69.2%
female: 49.7% (2015 est.)
total population: 78.6%
male: 84.35%
female: 72.65% (2010 est.)
Unemployment, youth ages 15–24
total: 12.4% (2016 est.)
male: NA (2016 est.)
female: NA (2016 est.)
Total and percent distribution of population by single year of age (Census 2006) Structure of the population (DHS 2013) (males 87,034, females 89,529 = 176,574): Emigration Today millions of ethnic Nigerians live abroad; the largest communities can be found in the United Kingdom (500,000–3,000,000) and the United States (600,000–1,000,000), followed by South Africa, Gambia, and Canada. There are between 90,000 and 100,000 Nigerians in Brazil, many of them living illegally without proper documentation. Additionally, there were around 100,000 Nigerians living in China in 2012, mostly in the city of Guangzhou, but their numbers have since declined to about 10,000 owing to strict immigration enforcement by Chinese officials, as many of them were known for engaging in illegal activities. There are also large groups in Ireland, Portugal and many other countries. Motivations for emigration are based heavily on socio-economic issues such as warfare, insecurity, economic instability and civil unrest. Between 1400 and 1900, 1.4 million of 2 million emigrants were slaves sent to the Americas, because the land now known as Nigeria was a central point of four slave trades during the 19th century.
Though bondage accounted for a great share of this emigration, an estimated 30,000 Nigerian inhabitants relocated to Kano City and Gambia to take advantage of financial opportunities afforded by fertile land and available natural resources. In addition, the presence of gold mines and rail lines along the Gold Coast, present-day Ghana, attracted an estimated 6,500 Nigerian citizens seeking financial gain and opportunity. The population of Nigerians in Ghana rose to roughly 149,000 before the 1969 alien expulsion order displaced nearly the entire population to surrounding countries. Religion Nigeria is nearly equally divided between Islam and Christianity. The majority of Nigerian Muslims are Sunni and are concentrated in the northern, central and south-western zones of the country, while Christians dominate in some central states (especially Plateau and Benue states) and the south-east and south-south regions. Other religions practiced in Nigeria include African Traditional Religion, Hinduism, the Baháʼí Faith, Judaism, the Grail Movement, and the Reformed Ògbóni Fraternity, one of the traditional socio-religious institutions of the Yorùbá people and their Òrìṣà religion, known as Ẹ̀sìn Òrìṣà Ìbílẹ̀ in the Yorùbá language. According to a 2009 Pew survey, 50.4% of Nigeria's population were Muslims. A later Pew study in 2011 calculated that Christians now formed 50.8% of the population. Adherents of other religions make up about 1% of the population. The shift of population balance between Muslims and Christians is a result of northern and southern Nigeria being in different stages of demographic transition. | rate of more than 6.5% without commensurate increases in social amenities and infrastructure." He also stated that the population "grew substantially from 17.3% in 1967 to 49.4% in 2017."
Fertility and births Total Fertility Rate (TFR) (Wanted TFR) and Crude Birth Rate (CBR): Fertility data as of 2013 (DHS Program): Source: Demographic and Health Surveys (DHS) Fertility rate by state ∗ MICS surveys Contraceptive prevalence Contraceptive prevalence, any methods (% of women ages 15–49) ∗ UNICEF's State of the World's Children and Childinfo, United Nations Population Division's World Contraceptive Use, household surveys including Demographic and Health Surveys and Multiple Indicator Cluster Surveys.
parties are: Accord, Action Alliance, Action Democratic Party, Action Peoples Party, African Action Congress, African Democratic Congress, All Progressives Congress, All Progressives Grand Alliance, Allied Peoples Movement, Boot Party, Labour Party, National Rescue Movement, New Nigeria Peoples Party, Peoples Democratic Party, Peoples Redemption Party, Social Democratic Party, Young Progressive Party, Zenith Labour Party. Electoral system and recent elections The president and members of the National Assembly of Nigeria are elected by citizens who are at least 18 years old. The Independent National Electoral Commission is responsible for monitoring elections and ensuring that the results are correct and not fraudulent. The winner of a position is elected through the first-past-the-post system, as used in Great Britain. Nigeria has faced numerous bouts of electoral fraud; particularly noteworthy is the general election of 2007, which was reportedly marred by ballot-rigging, underage voting, violence, intimidation, and an overall absence of clarity and accuracy on the part of the electoral commission. Presidential elections of Nigeria, 2015 House of Representatives Senate Presidential election of Nigeria, 2019 Christian-Muslim relations Islamic law has found its way into the heart of many Nigerian state governments, particularly in the northern part of the country. There is a deep rift between Christians and Muslims in Nigeria, and the government has therefore adopted a hybrid of English common law and Islamic law when dealing with legal issues in order to appease the diverse national population. Nigeria has the largest population of cohabiting Christians and Muslims in the world. These two religions were introduced to Nigeria largely during the colonial period, and since then, many Africans have merged their own traditional religions with these two institutionalized ones.
Religious tensions between Christians and Muslims in Nigeria have often been exploited by politicians and other powerful people to incite violence and create fear and chaos among Nigerians. This has led many citizens to question why Nigeria remains one federal state and whether it should split along the Christian-Muslim divide. The northern section of the country is largely Islamic, with 12 states under Sharia law, while the southern area is mostly Christian. There have been multiple attempts by Nigerian Muslims to add Sharia concepts to the Constitution, which has alarmed the Christian population within the nation. Many Christians deem the rise of Islam in Nigeria dangerous, fearing it could lead to increased terrorism and instability. This conflict threatens the stability of Nigeria's democracy, internal structure, and civil society, and many political scientists and Nigerian leaders hope the two religions can engage in a peaceful dialogue that pacifies both sides. Terrorism in Nigeria The greatest terrorist threat in Nigeria comes from the organization Boko Haram, which became a prevalent issue in the summer of 2009. Boko Haram is a radical jihadist terrorist group from the northern part of Nigeria. The organization has launched terror attacks largely targeting the Nigerian federal government, non-Muslim religious organizations, and ordinary citizens. The rise and growing impact of Boko Haram have been attributed to the instability and fragility of the Nigerian state. Its members are angered by government corruption and policy failures, and in particular by the poverty and lack of development of the predominantly Muslim north of Nigeria. The impact of Boko Haram on Nigeria has been devastating: over 37,000 individuals have died in its terrorist attacks since 2011, and over 200,000 Nigerians have been displaced.
Boko Haram was responsible for the kidnapping of hundreds of schoolgirls in 2014, triggering the #BringBackOurGirls movement across the globe. The terrorist organization pledged allegiance to ISIS in 2015, raising concerns about the safety and stability of Nigeria. Many world powers, including the United States, have contributed military resources to the fight against Boko Haram, in part because Nigeria's oil industry is crucial to the international economy. The Nigerian federal government has launched programs and tactics to combat Boko Haram because of its prevalence. There has also been a recent rise in citizen-created and, in particular, youth-led groups taking action against Boko Haram to protect themselves and their communities. Both the actions of Boko Haram and the government's efforts to combat terrorism have led to a growing refugee crisis in Nigeria. Commonwealth membership Nigeria's membership in the British Commonwealth began in 1960 and was suspended from 1995 to 1999, when the country was under military rule. It was reinstated in 1999 when democracy was established with the Presidential Constitution and Fourth Republic of Nigeria, and Nigeria remains a part of the Commonwealth to this day. The Commonwealth Secretariat aims to help Nigeria detect and deter corruption within its federal government. In 2018, it trained numerous government officials and financial officers in how to combat and condemn corruption within the nation. The Secretariat's involvement in both governmental and financial affairs created a better system for the transaction of goods and services in Nigeria with less risk of corruption. As of 2017, the Commonwealth has provided Nigeria with policies and resources concerning Great Britain's exit from the European Union and outlined the possible effects on Commonwealth nations and trade. The Commonwealth Secretariat has also helped Nigeria in its natural resource fields such as oil and mining.
It has helped with negotiations and the creation of fair agreements. The Commonwealth Secretariat has also provided Nigeria with access to its Connectivity Agenda, which allows nations of the Commonwealth to communicate and exchange ideas and policies to help each other with economic and domestic productivity. States of Nigeria Nigeria is made up of 36 states and 1 territory. They are: the Federal Capital Territory, Abia, Adamawa, Akwa Ibom, Anambra, Bauchi, Bayelsa, Benue, Borno, Cross River, Delta, Ebonyi, Edo, Ekiti, Enugu, Gombe, Imo, Jigawa, Kaduna, Kano, Katsina, Kebbi, Kogi, Kwara, Lagos, Nasarawa, Niger, Ogun, Ondo, Osun, Oyo, Plateau, Rivers, Sokoto, Taraba, Yobe, and Zamfara. Local Governments Each state is further divided into Local Government Areas (LGAs). These states and their local governments are essential to the functioning of a federal system because they have a pulse on the local population and can therefore assess the needs of constituents and enact helpful policy or infrastructure. They also matter because the federal government has the time and resources to take on national projects and international affairs while local governments take care of the Nigerians native to their respective states. The devolution of power between the states and the federal government helps the functionality of Nigeria. 774 local governments oversee the collection of local taxes, education, health care, roads, waste, and planning. Local governments look after the affairs of ordinary citizens in Nigerian society. Local government reform began in 1968–1970 under the military government but was fully implemented in 1976. Federal Government's handling of COVID-19 The coronavirus pandemic has ravaged Nigeria, Africa's most populous nation. Nigeria has been able to detect, respond to, and prevent COVID-19 outbreaks only in a very restricted, poor fashion.
Nigeria lacks the resources to conduct the widespread testing the nation needs to keep up with the number of cases surging across the state. Nigeria also lacks other necessary resources for fighting the virus, such as hospital workers, rooms, and ventilators. The president is elected through universal suffrage. He or she is both the chief of state and head of government, heading the Federal Executive Council, or cabinet. The president is elected to see that the Nigerian Constitution is enacted and that legislation is applied to the people. The elected president is also in charge of the nation's armed forces and can serve no more than two four-year elected terms. The current President of Nigeria is Muhammadu Buhari, who was elected in 2015, and the current Vice President is Yemi Osinbajo. The executive branch is divided into Federal Ministries, each headed by a minister appointed by the president. The president must include at least one member from each of the 36 states in his cabinet. The President's appointments are confirmed by the Senate of Nigeria. In some cases, a federal minister is responsible for more than one ministry (for example, Environment and Housing may be combined), or a minister may be assisted by one or more ministers of State. Each ministry also has a Permanent Secretary, who is a senior civil servant. The ministries are responsible for various parastatals (government-owned corporations), such as universities, the National Broadcasting Commission, and the Nigerian National Petroleum Corporation. However, some parastatals are the responsibility of the Office of the Presidency, such as the Independent National Electoral Commission, the Economic and Financial Crimes Commission and the Federal Civil Service Commission. Legislative branch The National Assembly of Nigeria has two chambers: the House of Representatives and the Senate. The House of Representatives is presided over by the Speaker of the House of Representatives.
It has 360 members, who are elected for four-year terms in single-seat constituencies. The Senate, which has 109 members, is presided over by the President of the Senate. 108 members are elected for four-year terms in 36 three-seat constituencies, which correspond to the country's 36 states. One member is selected in the single-seat constituency of the federal capital. Legislators are elected to the House of Representatives or the Senate to represent their constituencies and to pass legislation that benefits the public. The legislative process consists of bills being drafted and presented in either of the two chambers. These bills become national law only once they are approved by the president of Nigeria, who can veto bills. The President of the Senate is currently Ahmed Ibrahim Lawan, who was first elected to the Senate in 2007, and the Speaker of the House is Femi Gbajabiamila, who has been Nigeria's 9th Speaker of the House of Representatives since 2019. Each member of the National Assembly of Nigeria can only be elected to two four-year terms. Recently, the legislative branch has been misusing its position as a check on the power of the president and his cabinet: legislators have been known to use their power not only for law-making but as a means of political intimidation and a tool to promote individual monetary gain. Senators are paid a salary equivalent to over US$2,200 a month, supplemented by expenses of US$37,500 a month (2018 figures). Judicial branch The judicial branch consists of the Supreme Court of Nigeria, the Court of Appeals, the High Courts, and other trial courts such as the Magistrates', Customary, Sharia and other specialised courts. The National Judicial Council serves as an independent executive body, insulating the judiciary from the executive arm of government.
The Supreme Court is presided over by the Chief Justice of Nigeria and thirteen associate justices, who are appointed by the President of Nigeria on the recommendation of the National Judicial Council. These justices are subject to confirmation by the Senate. The judicial branch is the only one of the three branches of government whose members are appointed rather than elected. The judiciary, and the Supreme Court in particular, is intended to uphold the principles and laws of the nation's constitution, which was written in 1999, and to protect the basic rights of citizens. The current Chief Justice of the Supreme Court is Ibrahim Tanko Muhammad. Democracy in Nigeria Nigeria democratized in 1999 with the start of the Fourth Republic, but has suffered some setbacks on the path to becoming fully democratic. Elites in Nigeria have been found to have more power and influence than average citizens, and as a consequence, there has been a great deal of corruption in Nigerian politics and public life. A good sign for democracy in Nigeria is that elections are becoming less fraudulent and there is more party competition. Another indicator of a strong democracy is the presence of a civil society in which citizens have the right to act and speak freely, together with a strong use of media in everyday life. Furthermore, Nigeria has seen heightened use of media around political issues, particularly with the recent Special Anti-Robbery Squad (SARS) protests, indicating a sense of freedom for the public to voice their opinions to the government and the world. Level of freedom According to the 2020 World Press Freedom Index, Nigeria is the 115th most free nation in the world. It has been noted as a nation with perpetuating violence against freedom of speech and the press. Nigeria has been found to be a vulnerable nation, at risk of both modern slavery and corruption.
The nation is vulnerable due to the effects of internal conflict and governance issues. Freedom House has rated Nigeria as a "partly free" nation. In the last presidential election, the process was tainted by violence, intimidation and vote buying, which have been prevalent in many recent elections within Nigeria. Similarly, in the most recent legislative elections, citizens claimed the process was also characterized by intimidation and other inconsistencies. The electoral process and related laws are thought to be enacted in a mostly fair fashion, but there have been instances of voting being intentionally complicated, affecting turnout. The people of Nigeria feel that there is more freedom in their right to have different political parties represent their opinions, as exemplified by the vast number of legitimate parties seen in elections. Similarly, Nigerian opposition parties have a legitimate chance to participate in politics and win official positions. With regard to freedom of political expression, Freedom House indicates that opinions and institutions are often heavily influenced by non-governmental, external entities or individuals. In Nigeria, all ethnic groups and religious backgrounds have an equal opportunity to participate in politics; however, there is a lack of women elected into government, and same-sex relationships were criminalized in 2014. Federal officials such as the president and legislators are elected to enact policy and laws, and are usually allowed to do so without interruption, but in recent years their ability to legislate has been marred by corruption and instability. Corruption has been a major problem for the Nigerian government since its independence from colonial rule. In particular, the oil sector has allowed a great deal of corruption to take place.
The government has tried to enact measures to combat corruption that infringes upon the functioning of the state, but these have been only partially successful. The government has also been rated as lacking in transparency, often not making records available to the public that should be readily accessible. Journalism and the media in Nigeria are somewhat free: they are allowed to function independently of the government, but those who criticize public figures or offices are often arrested or censored. A mafia-like organisation, Black Axe, is involved in international corruption, especially online fraud, as reported by the BBC. Religious freedom is allowed in Nigeria; however, the government and even non-governmental organizations have been known to respond violently to groups that openly dissent from the federal government. Religion is a contentious topic in Nigeria because of heated, ongoing conflicts between Christians and Muslims within the state. Freedom House rated the Nigerian federal government well in the category of academic freedom and the public's ability to express views that disagree with the government without fearing a negative reaction. The government was rated moderately on people's ability to assemble, the ability to work on human rights, and the existence of unions. The judiciary was rated as moderately free from the government, and lacking in due process in trials and equal treatment of all members of society. People in Nigeria do not have great freedom of movement, and are often subjected to curfews set by the federal government in areas at risk of violence or instability. There is also a lack of protection for women with regard to abortion rights, rape, and domestic abuse under Nigerian federal law.
the Paris Club to buy back the bulk of its owed debts from them, in exchange for a cash payment of roughly US$12 billion. According to a Citigroup report published in February 2011, Nigeria would have the highest average GDP growth in the world between 2010 and 2050. Nigeria is one of two countries from Africa among the 11 Global Growth Generators countries. Overview In 2014, Nigeria changed its economic analysis to account for fast-growing contributors to its GDP, such as telecommunications, banking, and its film industry. In 2005, Nigeria reached an agreement with the Paris Club of lending nations to eliminate all of its bilateral external debt. Under the agreement, the lenders will forgive most of the debt, and Nigeria will pay off the remainder with a portion of its energy revenues. Moreover, human capital is underdeveloped (Nigeria ranked 161 out of 189 countries in the United Nations Human Development Index in 2019) and non-energy-related infrastructure is inadequate. From 2003 to 2007, Nigeria attempted to implement an economic reform program called the National Economic Empowerment Development Strategy (NEEDS). The purpose of NEEDS was to raise the country's standard of living through a variety of reforms, including macroeconomic stability, deregulation, liberalization, privatization, transparency, and accountability. NEEDS addressed basic deficiencies, such as the lack of freshwater for household use and irrigation, unreliable power supplies, decaying infrastructure, hindrances to private enterprise, and corruption. NEEDS was intended to create 7 million new jobs, diversify the economy, boost non-energy exports, increase industrial capacity utilization, and improve agricultural productivity. A related initiative at the state level is the State Economic Empowerment Development Strategy (SEEDS). A longer-term economic development program is the United Nations (UN)-sponsored National Millennium Goals for Nigeria.
Under the program, which covers the years from 2000 to 2015, Nigeria is committed to achieving a wide range of ambitious objectives involving poverty reduction, education, gender equality, health, the environment, and international development cooperation. In an update released in 2004, the UN found that Nigeria was making progress toward achieving several goals but was falling short on others. Specifically, Nigeria had advanced efforts to provide universal primary education and protect the environment. A requirement for achieving many of these worthwhile objectives is reducing endemic corruption, which obstructs development and stains Nigeria's business environment. President Olusegun Obasanjo's campaign against corruption, which includes the arrest of officials accused of misdeeds and the recovery of stolen funds, has won praise from the World Bank. In September 2005, Nigeria, with the assistance of the World Bank, began to recover US$458 million of illicit funds that had been deposited in Swiss banks by the late military dictator Sani Abacha, who ruled Nigeria from 1993 to 1998. Although broad-based progress has been slow, these efforts have begun to show in international surveys of corruption: Nigeria's ranking has consistently improved since 2001, reaching 147 out of 180 countries in Transparency International's 2007 Corruption Perceptions Index. The Nigerian economy suffers from an ongoing supply crisis in the power sector. Despite a rapidly growing economy, some of the world's largest deposits of coal, oil and gas, and the country's status as Africa's largest oil producer, power supply difficulties are frequently experienced by residents. Two-thirds of Nigerians expect living conditions to improve in the coming decades. Economic history This is a chart of the trend of gross domestic product of Nigeria at market prices, estimated by the International Monetary Fund, with figures in USD billions.
Figures before 2000 are backwards projections from the 2000–2012 numbers, based on historical growth rates, and should be replaced when data becomes available. The figure for 2014 is derived from a rebasing of economic activity earlier in the year. NOTES: The US dollar exchange rate is an estimated average of the official rate throughout a year, and does not reflect the parallel market rate at which the general population accesses foreign exchange. This rate ranged from a high of 520 in March 2017 to a low of 350 in August 2017, due to a scarcity of forex (oil earnings had dropped by half), and to speculative activity as alleged by the Central Bank. All the while the official rate was pegged at 360. Per capita income (as % of USA) is calculated using data from estimates in the PPP link above, and from census estimates, based on growth rates between census periods. For instance, 2017 GDPs were 1,125 billion (Nigeria) vs. 19,417 billion (USA), and populations were estimated at 190 million (Nigeria) vs. 320 million (USA). The ratio is therefore (1125/19417) / (190/320), which roughly comes to 0.0975. These are estimates and are intended to give a feel for relative wealth and standard of living, as well as the market potential of Nigeria's middle class. This is a chart of the trend of the global ranking of the Nigerian economy, in comparison with other countries of the world, derived from the historical List of countries by GDP (PPP). This chart shows the variance in the parallel exchange rate at which the dollar can be obtained with naira in Lagos, with "Best" being cheaper for a Nigerian (i.e. a stronger naira). For purchasing power parity comparisons, the US dollar is exchanged at US$1 to 314.27 Nigerian naira (as of 2017). Current GDP per capita of Nigeria expanded 132% in the sixties, reaching a peak growth of 283% in the seventies. But this proved unsustainable, and it consequently shrank by 66% in the 1980s.
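As a quick check, the per-capita calculation described in the notes above can be reproduced in a few lines (the figures are the 2017 estimates quoted in the text):

```python
# Nigeria's GDP per capita as a fraction of the USA's, using the
# 2017 estimates above (GDP in US$ billions, population in millions).
gdp_ngr, gdp_usa = 1_125, 19_417
pop_ngr, pop_usa = 190, 320

# Per-capita ratio = (GDP ratio) / (population ratio).
ratio = (gdp_ngr / gdp_usa) / (pop_ngr / pop_usa)
print(round(ratio, 4))  # 0.0976, i.e. roughly the 0.0975 quoted
```

Dividing the GDP ratio by the population ratio is equivalent to comparing the two countries' per-capita incomes directly.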
In the 1990s, diversification initiatives finally took effect and decadal growth was restored to 10%, although GDP on a PPP basis did not increase until the 2000s. In 2012, the GDP was composed of the following sectors: agriculture: 40%; services: 30%; manufacturing: 15%; oil: 14%. By 2015, the GDP was composed of the following sectors: agriculture: 18%; services: 55%; manufacturing: 16%; oil: 8%. In 2005, Nigeria's inflation rate was an estimated 15.6%. Nigeria's goal under the National Economic Empowerment Development Strategy (NEEDS) program is to reduce inflation to the single digits. By 2015, Nigeria's inflation stood at 9%. In 2005, the federal government had expenditures of US$13.54 billion but revenues of only US$12.86 billion, resulting in a budget deficit of 5%. By 2012, expenditures stood at $31.61 billion, while revenues were $54.48 billion. Economic sectors Agriculture Nigeria ranks sixth worldwide and first in Africa in farm output. The sector accounts for about 18% of GDP and almost one-third of employment. Nigeria has 19 million head of cattle, the largest herd in Africa. Though Nigeria is no longer a major exporter, due to a local consumer boom, it is still a major producer of many agricultural products, including cocoa, groundnuts (peanuts), rubber, and palm oil. Cocoa production, mostly from obsolete varieties and overage trees, has increased from around 180,000 tons annually to 350,000 tons. Major agricultural products include cassava (tapioca), corn, cocoa, millet, palm oil, peanuts, rice, rubber, sorghum, and yams. In 2003, livestock production, in order of metric tonnage, featured eggs, milk, beef and veal, poultry, and pork, respectively. In the same year, the total fishing catch was 505.8 metric tons. Roundwood removals totaled slightly less than 70 million cubic meters, and sawnwood production was estimated at 2 million cubic meters. The agricultural sector suffers from extremely low productivity, reflecting reliance on antiquated methods.
Agriculture has failed to keep pace with Nigeria's rapid population growth, so that the country, which once exported food, now imports a significant amount of food to sustain itself. However, efforts are being made towards making the country food-sufficient again. Nigeria produced in 2018:
59.4 million tons of cassava (the largest producer in the world);
47.5 million tons of yams (largest producer in the world);
10.1 million tons of maize (14th largest producer in the world);
7.8 million tons of palm oil (4th largest producer in the world, behind Indonesia, Malaysia and Thailand);
6.8 million tons of sorghum (2nd largest producer in the world, behind only the United States);
6.8 million tons of rice (14th largest producer in the world);
4 million tons of sweet potatoes (3rd largest producer in the world, behind China and Malawi);
3.9 million tons of tomatoes (11th largest producer in the world);
3.3 million tons of taro (largest producer in the world);
3 million tons of plantains (5th largest producer in the world);
2.8 million tons of peanuts (3rd largest producer in the world, behind China and India);
2.6 million tons of cowpeas (largest producer in the world);
2.2 million tons of millet (4th largest producer in the world, behind India, Niger and Sudan);
2 million tons of okra (2nd largest producer in the world, behind only India);
1.6 million tons of pineapples (7th largest producer in the world);
1.4 million tons of sugarcane;
1.3 million tonnes of potatoes;
949 thousand tons of mangos (including mangosteen and guava);
938 thousand tons of onions;
833 thousand tons of papayas (6th largest producer in the world);
758 thousand tons of soy;
747 thousand tons of green peppers;
585 thousand tons of egusi;
572 thousand tons of sesame seeds (4th largest producer in the world, behind Sudan, Myanmar and India);
369 thousand tons of ginger (3rd largest producer in the world, behind India and China);
332 thousand tons of cocoa (4th largest producer in the world).
Overseas remittances A major source of foreign exchange earnings for Nigeria is remittances sent home by Nigerians living abroad. In 2014, 17.5 million Nigerians lived in foreign countries, with the UK and the USA each having more than 2 million Nigerians. According to the International Organization for Migration, Nigeria witnessed a dramatic increase in remittances sent home from overseas Nigerians, going from US$2.3 billion in 2004 to $17.9 billion in 2007, representing 6.7% of GDP. The United States accounts for the largest portion of official remittances, followed by the United Kingdom, Italy, Canada, Spain and France. On the African continent, Egypt, Equatorial Guinea, Chad, Libya and South Africa are important source countries of remittance flows to Nigeria, while China is the biggest remittance-sending country in Asia. Labour force In 2015, Nigeria had a labour force of 74 million. In 2003, the unemployment rate was 10.8% overall; by 2015, unemployment stood at 6.4%. Since 1999, the Nigerian Labour Congress (NLC), a union umbrella organization, has called six general strikes to protest domestic fuel price increases. However, in March 2005 the government introduced legislation ending the NLC's monopoly over union organizing. In December 2005, the NLC was lobbying for an increase in the minimum wage for federal workers. The existing minimum wage, which was introduced six years earlier but has not been adjusted since, has been whittled away by inflation to only US$42.80 per month. According to the International Organization for Migration, the number of immigrants residing in Nigeria has more than doubled in recent decades – from 477,135 in 1991 to 971,450 in 2005.
The majority of immigrants in Nigeria (74%) are from neighbouring Economic Community of West African States (ECOWAS) countries, and this share has increased considerably over the last decade, from 63% in 2001 to 97% in 2005. The government has to pay a high interest rate on bonds, in part because of the high fertility rate: there are many children and little savings. Human capital As of 2019, Nigeria's HDI (Human Development Index) is ranked 161st at 0.539. The comparative value for Sub-Saharan Africa is 0.547, 0.926 for the US, and 0.737 for the world average. The value for the education index is 0.499, compared to the average in the US of 0.900. The expected years of schooling in Nigeria is 10.0 (16.3 in the US), while the mean years of schooling for adults over 25 years is 6.7 years (13.4 years in the US). Additionally, Nigeria faces relatively high inequality, worsening the problem of human capital formation. Government policy Inflation In 2016, the black market exchange rate of the naira was about 60% above the official rate. The central bank releases about $200 million each week at the official exchange rate. However, some companies report that budgets now include a 30% "premium" to be paid to central bank officials to get dollars. Nigeria's inflation rate rose to 15.63 per cent in December 2021 from 15.40 per cent in November, the National Bureau of Statistics announced on 17 January 2022. The statistics office said the prices of goods and services, measured by the Consumer Price Index, increased by 15.63 per cent in December 2021 when compared to December 2020. According to the NBS, the rise in the food index was caused by increases in the prices of bread and cereals, food products, meat, fish, potatoes, yam and other tubers, soft drinks and fruits.
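The NBS figure quoted above is a year-on-year rate: the percentage change in the Consumer Price Index relative to the same month a year earlier. A minimal sketch of that calculation (the CPI values here are illustrative placeholders, not NBS data):

```python
# Year-on-year inflation from two CPI readings.
def yoy_inflation(cpi_now: float, cpi_year_ago: float) -> float:
    """Percentage change in the CPI over twelve months."""
    return (cpi_now / cpi_year_ago - 1) * 100

# Illustrative only: an index moving from 100.0 to 115.63 over a year
# corresponds to the 15.63 per cent rate reported for December 2021.
print(round(yoy_inflation(115.63, 100.0), 2))  # 15.63
```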
Foreign economic relations Nigeria's foreign economic relations revolve around its role in supplying the world economy with oil and natural gas, even as the country seeks to diversify its exports, harmonize tariffs in line with a potential customs union sought by the Economic Community of West African States (ECOWAS), and encourage inflows of foreign portfolio and direct investment. In October 2005, Nigeria implemented the ECOWAS common external tariff, which reduced the number of tariff bands. Prior to this revision, tariffs constituted Nigeria's second largest source of revenue after oil exports. In 2005, Nigeria achieved a major breakthrough when it reached an agreement with the Paris Club to eliminate its bilateral debt through a combination of write-downs and buybacks. Nigeria joined the Organization of the Petroleum Exporting Countries in July 1971 and the World Trade Organization in January 1995. If the global transition to renewable energy is completed and international demand for Nigeria's petroleum resources ceases, Nigeria will be significantly weakened. It is ranked 149 out of 156 countries in the index of Geopolitical Gains and Losses after energy transition (GeGaLo). External trade In 2017, Nigeria imported about US$34.2 billion of goods. The leading sources of imports were China (28%), Belgium-Luxembourg (8.9%), the Netherlands (8.3%), South Korea (6.4%), the United States (6.0%) and India (4.6%). Principal imports were manufactured goods, machinery and transport equipment, chemicals, and food and live animals. In 2017, Nigeria exported about US$46.68 billion of goods. The leading destinations for exports were India (18%), the United States (14%), Spain (9.7%), France (6.0%) and the Netherlands (4.9%). In 2017, oil accounted for 83% of merchandise exports. Natural rubber and cocoa are the country's major agricultural exports.
In 2005, Nigeria posted a US$26 billion trade surplus, corresponding to almost 20% of gross domestic product. In the same year, Nigeria achieved a positive current account balance of US$9.6 billion. The Nigerian currency is the naira (NGN). As of June 2006, the exchange rate was about US$1 = NGN128.4. As of June 2019, it stands at US$1 = NGN357. In recent years, Nigeria has expanded its trade relations with other developing countries such as India. Nigeria is the largest African crude oil supplier to India, with annual exports valued at US$10 billion. India is the largest purchaser of Nigeria's oil, which fulfills 20% to 25% of India's domestic oil demand. Indian oil companies are also involved in oil drilling operations in Nigeria and have plans to set up refineries there. The trade volume between Nigeria and the United Kingdom rose by 35%, from US$6.3 billion in 2010 to US$8.5 billion in 2011. External debt In 2012, Nigeria's external debt was an estimated $5.9 billion, with N5.6 trillion in domestic debt, putting total debt at $44 billion. In April 2006, Nigeria became the first African country to fully pay off its debt owed to the Paris Club. This was structured as a debt write-off of approximately $18 billion and a cash payment of approximately $12 billion. Foreign investment In 2012, Nigeria received a net inflow of US$85.73 billion of foreign direct investment (FDI), much of which came from Nigerians in the diaspora. Most FDI is directed toward the energy and banking sectors. Any public policy designed to encourage inflows of foreign capital is capable of generating employment opportunities within the domestic economy. The Nigerian Enterprises Promotion (NEP) Decree of 1972
Connected lines: 348,933 fixed wired/wireless lines (July 2016). 222,440,207 mobile cellular (GSM) lines (July 2016). 3,611,926 mobile (CDMA) lines (July 2016). 226,426,215 total connected lines. Active lines: 164,114 fixed wired/wireless lines (July 2016). 149,708,077 mobile cellular (GSM) lines (July 2016). 371,613 mobile (CDMA) lines (July 2016). 150,262,066 total active lines. Installed capacity: 11,384,677 fixed wired/wireless lines (June 2013). 204,242,114 mobile (GSM) lines (June 2013). 18,400,000 mobile (CDMA) lines (June 2013). 234,026,791 total lines. Teledensity: ~86 combined fixed and mobile lines per 100 persons (June 2013). ~1 fixed line per 100 persons (2010). ~60 mobile lines per 100 persons (2010). Telephone system: further expansion and modernization of the fixed-line telephone network is needed; network quality remains a problem; the addition of a second fixed-line provider in 2002 resulted in faster growth, but fixed-line subscribership remains only about 1 per 100 persons; mobile-cellular services are growing rapidly, in part in response to the shortcomings of the fixed-line network; multiple cellular providers operate nationally, with subscribership approaching 60 per 100 persons (2010). Satellite earth stations: 3 Intelsat (2 Atlantic Ocean and 1 Indian Ocean) (2010). Submarine cables: SAT-3/WASC/SAFE links countries along the west coast of Africa to each other and on to Europe and Asia; ACE links countries along the west coast of Africa to each other and on to France; GLO-1 links countries along the west coast of Africa to each other and on to the United Kingdom; Main One links countries along the west coast of Africa to each other and on to Portugal. Deregulation of the mobile phone market has led to the introduction of Global System for Mobile Communication (GSM) network providers operating on the 900/1800 MHz spectrum: MTN Nigeria, Airtel Nigeria, Globacom, and 9mobile.
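Teledensity in the statistics above is simply lines per 100 inhabitants. A minimal sketch of the calculation, where the population figure is an assumed illustrative value, not from the text:

```python
# Teledensity = active lines per 100 inhabitants.
active_lines = 150_262_066   # total active lines, July 2016 (from text)
population = 185_000_000     # assumed 2016 population of Nigeria (illustrative)

teledensity = active_lines / population * 100
print(f"~{teledensity:.0f} active lines per 100 persons")
```

The ~86 per 100 figure in the text is for June 2013; a calculation from the July 2016 line counts with an assumed larger population naturally gives a somewhat different value.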
Use of cell phones has soared and has mostly replaced the unreliable fixed-line services of Nigerian Telecommunications Limited (NITEL). With the expiration of the exclusivity period of the main GSM network providers, Nigeria's telecom regulator, the Nigerian Communications Commission (NCC), introduced the Unified Licensing Regime. It was hoped that telecoms with unified licences would be able to provide fixed and mobile telephony and Internet access, as well as any other communications service they chose to offer. In March 2011 the NCC started registering SIM cards; the exercise was expected to last until 28 September 2011. In 2015 the NCC fined MTN Nigeria a record $5.2 billion for issuing 5.2 million unregistered and pre-registered Subscriber Identity Module (SIM) cards. In 2017 the NCC set up a 12-member task force in response to renewed proliferation of unregistered and pre-registered SIM cards, which are considered a threat to Nigerian national security. After a decade of failed privatization attempts, the incumbent national telecom NITEL and its mobile arm were sold to NATCOM and rebranded as NTEL. Internet Top-level domain: .ng Internet users: 122 million users, 7th in the world (2019); 67.0 million users, 8th in the world (2015); 55.9 million users, 8th in the world; 32.9% of the population, 128th in the world (2012); 44.0 million users, 9th in the world (2009); 5.0 million users, 40th in the world (2005). Fixed broadband: 15,311 subscriptions, 136th in the world; less than 0.05% of the population, 185th in the world (2012). Wireless broadband: 17.3 million subscriptions, 18th in the world; 10.2% of the population, 91st in the world (2012). Internet hosts: 1,234 hosts, 169th in the world (2012); 1,549 hosts, 134th in the world (2006). IPv4: 1.0 million addresses allocated, 75th in the world, less than 0.05% of the world total, 5.9 addresses per 1000 people (2012).
Internet service providers: ~100 ISPs (2018); ~400 ISPs (2010); ~11 ISPs (2000). Internet censorship and surveillance Listed by the OpenNet Initiative as showing no evidence of Internet filtering in all four areas for which it tests (political, social, conflict/security, and Internet tools) in October 2009. There are few government restrictions on access to the Internet and few credible reports that the government monitors e-mail or Internet chat rooms. Although the constitution and law provide for freedom of speech, including for members of the press, the government sometimes restricts these rights in practice. Libel is a civil offense and requires defendants to prove the truth of the opinion or value judgment contained in news reports or commentaries; penalties include two years' imprisonment and possible fines. Militant groups such as Boko Haram threaten, attack, and kill journalists in connection with their reporting of the sect's activities, and journalists practice self-censorship. Reporting on political corruption and security issues has proved to be particularly sensitive. On 24 October 2012 police in Bauchi State arraigned civil servant Abbas Ahmed Faggo before a court for allegedly defaming the character of Governor Isa Yuguda after he posted messages on his Facebook account
1,194 km of expressways) unpaved: 134,326 km (1998 est.) Note: some paved roads have lost their asphalt surface and are in very poor condition or have reverted to being gravel roads. Some of the road system is barely usable, especially in high-rainfall areas of the south. International highways Nigeria's strategic location and size mean that four routes of the Trans-African Highway network use its national road system: The Trans-Sahara Highway to Algeria is almost complete, but border security issues may hamper its use in the short term. The Trans-Sahelian Highway to Dakar is substantially complete. The Trans–West African Coastal Highway starts in Nigeria, connecting it westwards to Benin, Togo, Ghana and Ivory Coast, with feeder highways to landlocked Burkina Faso and Mali. When construction in Liberia and Sierra Leone is finished, the highway will continue to seven other Economic Community of West African States (ECOWAS) nations further west. The Lagos–Mombasa Highway has been awaited for many decades to kick-start trade across the continent. It does provide improved highway links to neighbouring Cameroon, but its continuation across DR Congo to East Africa is lacking, as are highways from Cameroon to Central Africa and Southern Africa, which could boost trade within the continent. Waterways Nigeria has 8,600 km of inland waterways. The longest are the Niger River and its tributary, the Benue River, but the most used, especially by larger powered boats and for commerce, are in the Niger Delta and all along the coast from Lagos Lagoon to Cross River. Pipelines In 2004 Nigeria had 105 kilometers of pipelines for condensates, 1,896 kilometers for natural gas, 3,638 kilometers for oil, and 3,626 kilometers for refined products.
Various pipeline projects are planned to expand the domestic distribution of natural gas and to export natural gas to Benin, Ghana, and Togo through the West African Gas Pipeline and, potentially, even to Algeria (where Mediterranean export terminals are located) through the proposed Trans-Saharan gas pipeline. Energy pipelines are subject to sabotage by militant groups and siphoning by thieves. (Another estimate: crude oil 2,042 km; petroleum products 3,000 km; natural gas 500 km.) Ports and harbors The Nigerian Ports Authority (NPA) is responsible for managing Nigeria's ports, some of which have fallen behind international standards in terms of the quality of facilities and operational efficiency. Recognizing that the government lacks the funding and expertise to modernize facilities and run the ports efficiently, the NPA is pursuing partial port privatization by granting concessions to private port operators. Under the terms of the concession agreements, the government would transfer operating rights to private companies for a finite number of years without forgoing ownership of the port land. Nigeria's principal container port is the port of Lagos, which handles about 5.75 million tons of cargo each year. The port, which consists of separate facilities at Apapa and Tin Can Island, has a rail connection to points inland. Port Harcourt, a transshipment port located 66 kilometers from the Gulf of Guinea along the Bonny River in the Niger Delta, handles about 815,000 tons of cargo each year and also has a railway connection. Both ports are not only responsible for Nigeria's seaborne trade but also serve inland countries such as Niger and Chad. A new port is under construction at Onne, about 25 kilometers south of Port Harcourt. Relatively modern and efficient terminals managed by multinational oil companies handle most oil and gas exports.
Atlantic Ocean ports: Calabar; Lagos (railhead); Tin Can Island Port; Onne (site of the Federal Ocean Terminal; railhead under construction); Port Harcourt (railhead); Sapele; Koko; Warri. The Lekki Port is under construction. River ports: Onitsha river port, Anambra state, located on the Niger River; Burutu river port, Delta state, located on the Forçados River; Oguta river port, Imo state, on Oguta Lake along the Njaba River; Lokoja river port, Kogi state, on the Niger River; Baro river port, Niger State, on the Niger River. The Benin river port on the Benin River in Benin, Edo state, and the Makurdi river port on the Benue River in Benue State are under construction. Bridges
the NAF faces a number of domestic challenges which continue to undermine stability within Nigeria and the region as a whole. These threats include the ongoing conflict against the jihadist rebel group Boko Haram in northeastern Nigeria, which has been underway since July 2009. Likewise, Nigeria has been engaged in a long-running anti-piracy campaign in the Niger Delta, which has threatened the country's vital petroleum industry, the source of 40% of Nigeria's exports and 85% of the government's revenue. Compounding this state of affairs is the role corruption plays in the ongoing attempts to strengthen the armed forces. Corruption has historically weakened the Nigerian military's capacity to face internal security threats, and is cited as being responsible for the continued longevity of rebels and terrorists operating throughout the nation. In spite of these challenges to its operational readiness, the Nigerian Armed Forces have committed to a number of wide-ranging modernization programs to bolster the discipline and firepower of their troops. These include the acquisition of new armored vehicles, combat aircraft and aerial reconnaissance drones, and the refurbishing of naval vessels which had suffered from prolonged periods of poor or minimal maintenance. These trends in the development of the armed forces as a fighting force, as well as efforts to combat corruption within the ranks of military personnel and the government bureaucracy, have been critically important to Nigeria's ability to confront challenges to its national security and stability in the wider region of West Africa as a whole. History The Nigerian Armed Forces' origins lie in the elements of the Royal West African Frontier Force that became Nigerian when independence was granted in 1960.
In 1956 the Nigeria Regiment of the Royal West African Frontier Force (RWAFF) was renamed the Nigerian Military Forces, RWAFF, and in April 1958 the colonial government of Nigeria took over from the British War Office control of the Nigerian Military Forces. Since its creation the Nigerian military has fought in a civil war – the conflict with Biafra in 1967–70 – and sent peacekeeping forces abroad both with the United Nations and as the backbone of the Economic Community of West African States (ECOWAS) Cease-fire Monitoring Group (ECOMOG) in Liberia and Sierra Leone. It has also seized power twice at home (1966 & 1983). The great expansion of the military during the civil war further entrenched the existing military hold on Nigerian society carried over from the first military regime. In doing so, it played an appreciable part in reinforcing the military's nearly first-among-equals status within Nigerian society, and the linked decline in military effectiveness. Olusegun Obasanjo, who by 1999 had become president, bemoaned the fact in his inaugural address that year: ‘... Professionalism has been lost... my heart bleeds to see the degradation in the proficiency of the military.’ Training establishments in Nigeria include the prestigious officer entry Nigerian Defence Academy at Kaduna, the Armed Forces Command and Staff College, Jaji, and the National War College at Abuja. The U.S. commercial military contractor Military Professional Resources Inc. has been involved from around 1999–2000 in advising on civil-military relations for the armed forces. Legal standing The roles of a country's armed forces are entrenched in her Constitution. The defence of the territorial integrity and other core interests of the nation form the major substance of such roles. 
Sections 217–220 of the 1999 Constitution of Nigeria address the Nigerian Armed Forces: (1) There shall be an armed forces for the Federation which shall consist of an army, a navy, an air force, and such other branches of the armed forces of the Federation as may be established by an Act of the National Assembly. (2) The Federation shall, subject to an Act of the National Assembly made in that behalf, equip and maintain the armed forces as may be considered adequate and effective for the purpose of – (a) defending Nigeria from external aggression; (b) maintaining its territorial integrity and securing its borders from violation on land, sea, or air; (c) suppressing insurrection and acting in aid of civil authorities to restore order when called upon to do so by the President, but subject to such conditions as may be prescribed by an Act of the National Assembly; (d) performing such other functions as may be prescribed by an Act of the National Assembly. (3) The composition of the officer corps and other ranks of the armed forces of the Federation shall reflect the federal character of Nigeria. Army The Nigerian Army (NA) is the land branch of the Nigerian Armed Forces and the largest of the armed services. Major formations include the 1st Division, the 2nd Division, the 3rd Armoured Division, the 81st Division, the 82nd Division, and the newly formed 6th, 7th, and 8th Divisions. The Nigerian Army is currently headed by Major General Farouk Yahaya, who was appointed by President Muhammadu Buhari. The Nigerian Army has played a major role in the defence of Nigerian democracy since the First Republic. Navy The Nigerian Navy (NN) is the sea branch of the Nigerian Armed Forces. The Nigerian Navy command structure today consists of the Naval Headquarters in Abuja and three operational commands with headquarters in Lagos, Calabar, and Bayelsa. The training command's headquarters are located in Lagos, the commercial capital of Nigeria, with training facilities spread all over Nigeria.
According to Global Firepower, the Nigerian Armed Forces are the fourth-most powerful military in Africa and ranked 35th internationally. The Nigerian Armed Forces were established in 1960 as the successor to the combat units of the Royal West African Frontier Force stationed in the country, which had previously served as the British Empire's multi-battalion field force during Nigeria's protectorate period. Shortly after its formation, the NAF was engaged in combat operations against the secessionist state of Biafra during the Nigerian Civil War from 1967 to 1970. During this period the Nigerian military ballooned in strength from 85,000 personnel in 1967 to more than 250,000 troops by the war's end. In the years following the civil war, the armed forces were halved from their post-war height to approximately 125,000 men. In spite of this contraction in size and funding, Nigeria boasted the only military in West Africa capable of engaging in foreign military operations, such as during its intervention in the Liberian civil war in 1990. Nigeria's armed forces have remained an active element in combat operations throughout the African continent over the succeeding decades, with notable engagements including the 2017 ECOWAS military intervention in the Gambia.
checks political stability of African countries and encourages them to hold regional meetings for the union. Nigeria backed the African National Congress (ANC) by taking a committed tough line with regard to the South African government and its military actions in southern Africa. Through the Organisation of African Unity (OAU, now the African Union), Nigeria has had tremendous influence among West African nations and in Africa as a whole. Nigeria has additionally founded regional cooperative efforts in West Africa, functioning as standard-bearer for ECOWAS and ECOMOG, economic and military organisations, respectively. Similarly, when civil war broke out in Angola after the country gained independence from Portugal in 1975, Nigeria mobilised its diplomatic influence in Africa in support of the Popular Movement for the Liberation of Angola (MPLA). That support helped tip the balance in the MPLA's favour, which led to OAU recognition of the MPLA over the National Union for the Total Independence of Angola. Nigeria extended diplomatic support to another cause, Sam Nujoma's South West Africa People's Organisation (SWAPO) in Namibia, to stall the apartheid South African-installed government there. In 1977, General Olusegun Obasanjo's new military regime donated $20 million to the Zimbabwean movement against the white-minority government of Rhodesia. Nigeria also sent military equipment to Mozambique to help the newly independent country suppress the South African-backed Mozambican National Resistance guerrillas. Nigeria also provided military training at the Kaduna first mechanised army division, and other material support, to Joshua Nkomo's and Robert Mugabe's guerrilla forces during the Zimbabwe War in 1979 against the white-minority rule of Prime Minister Ian Douglas Smith, which was backed by the apartheid government of South Africa.
Nigeria announced that it was launching a nuclear programme of "unlimited scope" of its own, but the effort failed owing to mismanagement of its economy and technology. After Nigerian independence in 1960, Nigeria demonstrated its seriousness about improving the economy for its people and embarked on nationalizing some multinational companies that traded with, and broke the economic/trade embargo of, the apartheid South African regime: the local operations of Barclays Bank were nationalised after that bank ignored the strong protests of the Nigerian populace, and Nigeria also nationalised British Petroleum (BP) for supplying oil to South Africa. In 1982, the Alhaji Shehu Shagari government urged the visiting Pope John Paul II to grant an audience to the leaders of the Southern African guerrilla organisations, Oliver Tambo of the ANC and Sam Nujoma of SWAPO. In December 1983, the new Major General Muhammadu Buhari regime announced that Nigeria could no longer afford an apartheid government in Africa. However, as the foremost black nation on Earth by population, Nigeria has great potential and may grow to be a force to reckon with on the global stage. Nigeria and West Africa In pursuing the goal of regional economic cooperation and development, Nigeria helped create ECOWAS, which seeks to harmonise trade and investment practices for its 16 West African member countries, ultimately achieve a full customs union, and establish a single currency. Nigeria has also taken the lead in articulating the views of developing nations on the need for modification of the existing international economic order.
Nigeria has played a central role in the ECOWAS efforts to end the civil war in Liberia and contributed the bulk of the ECOWAS peacekeeping forces sent there in 1990. Nigeria has also provided the bulk of troops for ECOMOG forces in Sierra Leone. Nigeria has enjoyed generally good relations with its immediate neighbours. With its considerable military power, Nigeria has played a leading role in West Africa and has persistently aimed at promoting peace and stability in the region for more than three decades. Nigeria and International Organisations Nigeria is a member of the following