Wednesday, December 31, 2008

Namibia- Diamonds

Namibia is world-renowned for its gem-quality placer diamonds, which occur along the Orange River as well as onshore and offshore along the Namibian coastline. The diamonds were originally transported down the Orange River into the Atlantic Ocean and distributed northwards by longshore currents. They typically occur as placers within raised and "drowned" beach terraces, in gullies in the bedrock, and in eluvial deposits in wind corridors in southern Namibia. As onshore diamond reserves are depleted, future diamond production will come predominantly from the seabed. Mid-water to deep-water mining operations require sophisticated marine vessels and crawlers capable of retrieving diamondiferous gravels and sands from the seafloor.

The major diamond producing company in Namibia is Namdeb Diamond Corporation (Pty) Ltd, which accounts for an average of 1.6 million carats per annum. Other companies mining diamonds in Namibia include Sakawe Mining Corporation (Samincor) and Diamond Fields Namibia (Pty) Ltd.

Tuesday, December 23, 2008

Parking Clocks

Charges apply in most of the Council's town and village car parks. In order to help motorists (particularly residents and those who work in our towns/villages) the council operates a clock parking scheme.
Car Parks at Ringwood

The display of a correctly set, valid clock allows motorists to park in the relevant NFDC car parks without incurring a further charge. Motorists who do not wish to purchase a clock can pay at the ticket machine. Clock holders remain subject to the waiting restrictions which apply in NFDC car parks. Parking is free for those displaying a valid Blue Disabled Parking Badge.

Thursday, December 18, 2008

Funds of hedge funds

"Funds of hedge funds," a relatively new type of investment product, are investment companies that invest in hedge funds. Some, but not all, register with the SEC and file semi-annual reports. They often have lower minimum investment thresholds than traditional, unregistered hedge funds and can sell their shares to a larger number of investors. Like hedge funds, funds of hedge funds are not mutual funds. Unlike open-end mutual funds, funds of hedge funds offer very limited rights of redemption. And, unlike ETFs, their shares are not typically listed on an exchange.

Monday, December 08, 2008

FDI measured and compared between countries

There is no single, simple method of measuring inward investment: most approaches focus either on the number of projects or jobs, or on the financial value of the investment. Financial measurements are of either stocks or flows.

FDI stocks
FDI stocks measure the cumulative stock of capital investment by foreign enterprises at a single point in time, taking account of both new investment and disinvestment. The United Nations Conference on Trade and Development (UNCTAD) publishes annual statistics on global stock movements in its World Investment Report.

FDI flows
FDI flows are new investments by foreign enterprises made during a period of time – either a calendar or a tax year. While much inward investment is included in FDI flow statistics, not all of it will be. For example, if an inward investor decided to expand its facilities in the UK but used local finance, this would not appear in FDI flow statistics as it involves no inflow of money to the country.
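As a rough sketch, the two measures are linked by a simple accounting identity (simplified here; published statistics also include valuation and exchange-rate adjustments):

```latex
\mathrm{FDI\ stock}_t \;=\; \mathrm{FDI\ stock}_{t-1} \;+\; \mathrm{FDI\ inflows}_t \;-\; \mathrm{disinvestments}_t
```

For example, a country with a $100bn stock that receives $10bn of new inflows and sees $3bn of disinvestment ends the year with roughly a $107bn stock.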

Friday, December 05, 2008

Italian painting of the 16th century

In a brief moment of equilibrium, artists achieved the harmonious balance and elevated conception that is the High Renaissance. In Rome this was shortly replaced by the self-conscious artifice of the style we call mannerism. Venice, on the other hand, produced a succession of artists devoted to color, light, and a more sensual approach to paint.

In the early sixteenth century, the center of patronage in the arts shifted to Rome. Pope Julius II recruited the finest artists of the day for his ambitious building program; Raphael and Michelangelo continued and expanded Leonardo's High Renaissance style, characterized by classical balance, controlled movement, and an elevated conception. Raphael learned from Leonardo that a fully resolved composition was attained only after intensive study of the human figure.

The High Renaissance drew to a close in the 1520s with the death of Raphael and the political and social upheaval following the Sack of Rome. Raphael's gifted Roman pupils dispersed as new ideas were ripening elsewhere in Italy. In Venice, for example, brilliant painters came forward at a phenomenal rate. New subjects for painting were devised: landscapes and cityscapes, still lifes, ecstatic visions of saints, and genre scenes of everyday life. Artists greatly expanded the expressive potential of the relatively new medium of oil paint.

Subsequent generations of artists reassessed the Renaissance norm of superb drawing combined with an idealization of nature, which had been established by Leonardo, Raphael, and Michelangelo. A group of central Italian painters, including Perino del Vaga and Pontormo, devised mannerism, a style of self-consciously elegant poses and markedly unnatural colors. The style of these artists was not fully accepted by Venetian artists, however, who were more interested in effects of color, light, atmosphere, and texture: Titian instilled a new sensuality in his art, while Tintoretto's scenes are boldly sketched and highly dramatic in mood.

Later in the sixteenth century came stylistic developments that are now called the baroque. A family of artists in Bologna, the Carracci, set about reinvigorating the grand tradition of Italian painting. Their efforts to combine central-Italian skill in drawing with the lifelike warmth and coloristic richness of the Venetians led to a new synthesis of nature and the ideal. The revolutionary dramatic naturalism of the short-lived Caravaggio influenced the work of dozens of artists all over Europe.

Tuesday, November 25, 2008

How the US President is elected

The election to the world's most powerful job isn't based on the popular vote. It's a bit more complicated than in India.

This is how it works:

Basically, the ballots carry Obama's and McCain's names, although elections for Congress are sometimes held simultaneously. But votes cast for Obama or McCain don't go to them directly; they go to the Electoral College, which consists of 538 popularly elected representatives who formally select the President.

At this point it's all or nothing.

The size of the Electoral College is equal to the total membership of both Houses of Congress (435 Representatives and 100 Senators plus the three electors allocated to Washington, D.C.), totaling 538 electors.

Each state is allocated as many electors as it has Representatives and Senators in the United States Congress. Since the most populous states have the most seats in the House of Representatives, they also have the most electors.

The six states with the most electors are California (55), Texas (34), New York (31), Florida (27), Illinois (21) and Pennsylvania (21).

  • Ballots have Obama, McCain's names
  • But votes cast go to the Electoral College
  • Whoever wins most votes in a state, wins all Electoral votes
  • Whoever gets 270 Electors (out of 538), wins

    Whichever Presidential candidate wins the most votes in a state wins all of that state's Electoral votes, even if the popular vote was split 51-49 percent. And whoever gets 270 Electors (out of 538) wins the US Presidential election.
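    A toy sketch of that winner-take-all arithmetic (the vote shares are hypothetical; the elector counts are the 2008 figures quoted above, and Maine and Nebraska, which can split their electors, are ignored):

```java
import java.util.Map;

public class ElectoralCollege {
    // Electors per state: a few of the 2008 figures quoted above.
    static final Map<String, Integer> ELECTORS = Map.of(
            "California", 55, "Texas", 34, "New York", 31,
            "Florida", 27, "Illinois", 21, "Pennsylvania", 21);

    public static void main(String[] args) {
        // Hypothetical popular-vote split in one state: even 51-49
        // hands every one of that state's electors to the winner.
        String state = "Florida";
        double obamaShare = 0.51, mccainShare = 0.49;
        String winner = (obamaShare > mccainShare) ? "Obama" : "McCain";
        System.out.printf("%s wins all %d of %s's electors.%n",
                winner, ELECTORS.get(state), state);
        System.out.println("First to 270 of the 538 electors wins.");
    }
}
```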
    Monday, November 24, 2008

    Abrams Falls

    Although Abrams Falls is only 20 feet high, the large volume of water rushing over falls more than makes up for its lack of height. The long, deep pool at its base is very picturesque. The waterfall and creek are named for Cherokee Chief Abram or Abraham whose village once stood several miles downstream.

    The trail to the falls traverses pine-oak forest on the ridges and hemlock and rhododendron forest along the creek. The hike is 5 miles roundtrip and considered moderate in difficulty. Due to strong currents and an undertow, swimming in the pool at the base of the falls is extremely dangerous.

    Access trail: Abrams Falls
    Trailhead: The turnoff for the trailhead is located past stop #10 on the Cades Cove Loop Road. The turnoff is signed.

    Tuesday, November 18, 2008

    Scars

    Also called: Cicatrix, Keloid scar
    A scar is a permanent patch of skin that grows over a wound. It forms when your body heals itself after a cut, scrape, burn or sore. You can also get scars from surgery that cuts through the skin, from infections like chickenpox, or from skin conditions like acne. Scars are often thicker, as well as pinker, redder or shinier, than the rest of your skin.

    How your scar looks depends on
    * How big and deep your wound is
    * Where it is
    * How long it takes to heal
    * Your age
    * Your inherited tendency to scar

    Scars usually fade over time but never go away completely. If the way a scar looks bothers you, various treatments might minimize it. These include surgical revision, dermabrasion, laser treatments, injections, chemical peels and cream.

    Tuesday, November 11, 2008

    Aggregate fruit

    An aggregate fruit, or etaerio, develops from a flower with numerous simple pistils. An example is the raspberry, whose simple fruits are termed drupelets because each is like a small drupe attached to the receptacle. In some bramble fruits (such as the blackberry) the receptacle is elongated and forms part of the ripe fruit, making the blackberry an aggregate-accessory fruit. The strawberry is also an aggregate-accessory fruit, but one in which the seeds are contained in achenes. In all these examples, the fruit develops from a single flower with numerous pistils. Some kinds of aggregate fruits are called berries, yet in the botanical sense they are not.

    Wednesday, November 05, 2008

    Candidate John McCain seemed to have it all.

    Few in America did not know about his decades of service, his breathtaking heroism in Vietnam, his foreign policy expertise and his ability to reach across the Congressional aisle.

    Mr McCain's opponent was largely untested, inexperienced and, initially at least, unknown; his race only added to his challenge.

    If there is such a thing as a perfect political storm though, John McCain found himself caught in the middle of it. In a leaky boat. With limited fuel.

    Hopes dashed

    This was another aspect of the McCain strategy that seemed to backfire. Although Mr McCain ran only 10% more purely negative adverts than his rival, according to media monitoring groups, they were more deeply personal attacks - accusing Mr Obama of having a close relationship with a "domestic terrorist", for example. Such ads created a backlash from independent voters, according to the polls, and Mr McCain was forced to change his tone.

    In fact, he could never quite find a narrative that worked. He went from being war hero, to the voice of experience, to maverick, to tax-cutter, but he never found a way to lift himself in the polls. His team hoped the three presidential debates would finally reveal their candidate to be best qualified for the job. But in the "town hall" setting Mr McCain favoured, he wandered around the stage and forgot that what may work in a real town hall doesn't necessarily work with a TV audience. In other debates he tried confronting Mr Obama, but was never able to shake the younger man's almost unnatural cool. At times, Mr McCain seemed to be trying to keep a simmering rage under control, which brought more negative coverage.

    When the credit crisis erupted and the economy stalled, it seemed a damning indictment of an era of Republican deregulation and "trickle-down" economics. Mr McCain's past quotes about the fundamentals of the economy being strong came back to haunt him. His tax plan - which seemed to favour the wealthy - rang hollow with people facing foreclosure and job losses. His abrupt suspension of his campaign to return to Washington and "fix the problem" seemed erratic and was ultimately ineffectual. In the end, he projected an image as a man from America's past, who had been through much and served his country well.

    But in a disgruntled nation, deeply disenchanted with Republicanism, he couldn't match the appeal of his younger opponent and his message of change.

    Monday, November 03, 2008

    GPS

    The Global Positioning System (GPS) is a U.S. space-based radionavigation system that provides reliable positioning, navigation, and timing services to civilian users on a continuous worldwide basis -- freely available to all. For anyone with a GPS receiver, the system will provide location and time. GPS provides accurate location and time information for an unlimited number of people in all weather, day and night, anywhere in the world.

    The GPS is made up of three parts: satellites orbiting the Earth; control and monitoring stations on Earth; and the GPS receivers owned by users. GPS satellites broadcast signals from space that are picked up and identified by GPS receivers. Each GPS receiver then provides three-dimensional location (latitude, longitude, and altitude) plus the time.
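    The distance calculation behind that fix is simple in principle: the receiver measures how long each satellite's signal took to arrive and multiplies by the speed of light; with ranges to at least four satellites it can solve for latitude, longitude, altitude, and its own clock error. A minimal sketch of the range step (the travel time is an illustrative value, and clock bias and atmospheric delay are ignored):

```java
public class Pseudorange {
    static final double SPEED_OF_LIGHT_M_PER_S = 299_792_458.0;

    public static void main(String[] args) {
        // GPS satellites orbit at roughly 20,200 km altitude, so a
        // signal takes on the order of 67-90 ms to reach a receiver.
        double travelTimeSeconds = 0.072;   // illustrative measurement
        double rangeMeters = SPEED_OF_LIGHT_M_PER_S * travelTimeSeconds;
        System.out.printf("Range to satellite: %.0f km%n", rangeMeters / 1000);
    }
}
```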

    Individuals may purchase GPS handsets that are readily available through commercial retailers. Equipped with these GPS receivers, users can accurately locate where they are and easily navigate to where they want to go, whether walking, driving, flying, or boating. GPS has become a mainstay of transportation systems worldwide, providing navigation for aviation, ground, and maritime operations. Disaster relief and emergency services depend upon GPS for location and timing capabilities in their life-saving missions. Everyday activities such as banking, mobile phone operations, and even the control of power grids, are facilitated by the accurate timing provided by GPS. Farmers, surveyors, geologists and countless others perform their work more efficiently, safely, economically, and accurately using the free and open GPS signals.

    Friday, October 31, 2008

    Indian Finance

    The Finance Department largely performs the function of advising the Government on all financial matters. The formulation of the Budget is one of its most important functions. The department is also entrusted with the responsibility of framing rules regulating the pay, emoluments and other service conditions of all Government employees. It has administrative control over the departments of Local Fund Audit and the Directorates of National Savings, Lotteries, Insurance and Treasuries.

    The regulatory function of the department is its most important. It is the nodal center for monitoring all financial transactions of the country. It performs the important functions of preparing the budget and monitoring the receipts and expenditure incurred during the year. Another important task of the department is to monitor the reappropriation of funds. Preparing rules relating to financial matters, and interpreting them when departments seek clarification, is also an important function.

    Wednesday, October 22, 2008

    Scientists: Earth May Exist in Giant Cosmic Bubble

    Earth may be trapped in an abnormal bubble of space-time that is particularly devoid of matter. Scientists say this condition could account for the apparent acceleration of the universe's expansion, for which dark energy currently is the leading explanation.

    Dark energy is the name given to the hypothetical force that could be drawing all the stuff in the universe outward at an ever-increasing rate. Current thinking is that 74 percent of the universe could be made up of this exotic dark energy, with another 21 percent being dark matter, and normal matter comprising the remaining 5 percent.

    Wednesday, October 15, 2008

    Broadband Speed Tests Questioned

    Virgin Media has criticised some broadband speed tests, saying they rely on "dirty data".

    It said current tests were often inaccurate. It is concerned that tests for 50Mbps (megabits per second) services, which are starting to launch, will be even more inaccurate. More people are using broadband speed tests to find out whether the speed they are actually getting comes close to what service providers promise.

    Error margin

    Most broadband consumers in the UK are currently using a service which offers speeds of up to 8Mbps, but there are wide variations in the actual speeds they receive. Virgin Media has been testing the testers and has pinpointed some issues with such services. Online speed tests generally work by sending a file to a computer and timing how long it takes. This so-called payload is often too small, according to Virgin, to give an accurate result. The error margin is amplified when speeds get up to 50Mbps, it said. It is also concerned by the way web-based speed tests measure only how fast data is able to travel from one part of the internet to another, which is subject to bottlenecks and delays. Other factors that affect results include the number of people using the test at any given time and the processing power of individual computers.
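    The arithmetic behind a payload test also shows why small payloads mislead at high speeds: a fixed timing error is negligible when the download takes seconds, but dominates when it takes a fraction of a second. A sketch with hypothetical numbers:

```java
public class SpeedTest {
    public static void main(String[] args) {
        double payloadBytes = 2_000_000;                   // hypothetical 2 MB payload
        double megabits = payloadBytes * 8 / 1_000_000.0;  // 16 megabits

        // An 8Mbps line needs ~2 s for this payload; 50 ms of timing
        // noise shifts the measurement by only a few percent.
        System.out.printf("8Mbps line:  %.1f Mbps measured%n", megabits / (2.00 - 0.05));

        // A 50Mbps line needs only ~0.32 s, so the same 50 ms of noise
        // skews the result by nearly 20 percent.
        System.out.printf("50Mbps line: %.1f Mbps measured%n", megabits / (0.32 - 0.05));
    }
}
```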

    Too costly

    Michael Phillips, head of broadbandchoices.co.uk, said some of the issues raised by Virgin were fair. He said he would be putting some caveats on his site's speed test. But he believes that for the majority of users on lower broadband speeds, such tests remained an important barometer of services. He said that the costs involved in creating an accurate test for faster speeds may be too high for those sites that make no money from the tests and simply offer them as an additional service to consumers. "It is very costly. If you host a server you have to pay for a feed to the internet and to get one that is reliable could prove prohibitive," he said. Virgin Media pledged to work with speed test providers to improve accuracy.

    Overall performance

    It recommended tests such as that devised by broadband comparison site SamKnows that uses hardware directly attached to customers' modems. The SamKnows kit has been adopted by Ofcom and attracted thousands of triallists keen to test out the system. It came about because the founders of SamKnows were themselves unhappy with the accuracy of other broadband speed tests.

    "We wanted to make it much more comprehensive, not so much about speed as overall performance," said Sam Crawford, the founder of SamKnows.

    Andrew Ferguson, head of broadband comparison site ThinkBroadband, is happy his speed tester is accurate. "We are confident that our speed tester is in a position to handle 50Mbps and faster broadband connections," he said. At the beginning of September 2008 the site adjusted the amount of data used during the tests to ensure reliable results were provided for fast connections. "As we test ever-faster connections, we will evolve the testing procedures," he said. According to analyst firm Forrester, only 12% of UK users have used such a speed test. Despite its concerns, Virgin Media appears to be performing well in such tests.

    Latest figures from independent broadband comparison site Point Topic put Virgin Media at the top of the league for delivering on its speed promises.

    Friday, October 03, 2008

    Outsourcing Advantages

    Software outsourcing has long passed the fad or buzzword stage. It is here to stay as an IT trend which has evolved, grown, matured and is living up to and outgrowing its potential. Especially for companies that wish to cut costs while gaining access to world-class software engineers, it is no longer just an option but a smart decision. One of the strongest factors attracting most Fortune 500 companies to the outsourcing industry is the significant savings attached to a software outsourcing project. On average, companies report a 40% to 60% increase in net savings with the help of offshore IT outsourcing.

    Half of all the Fortune 500 companies today target offshore software development in India. The core reason for preferring India as an offshore development partner over other competing destinations in the offshore IT outsourcing business is its vast pool of educated human resources, combined with world-class quality offerings and encouraging government policies for the IT sector.

    Wednesday, September 24, 2008

    Samsung Sticking to Microchip Investment Plans

    Despite a slump in the global semiconductor market, Samsung Electronics will stick to a plan to invest W7 trillion in computer chips in the second half of this year (US$1=W1,149). Kwon Oh-hyun, the head of Samsung's semiconductor division, confirmed this at the launch of a new growth-industries forum held in the National Assembly on Tuesday. This contrasts starkly with other global semiconductor companies like Hynix, Elpida and Power Chip, which have decided to cut production this year due to oversupply and the recession. In March, Hynix cut investment plans for the second half of this year from W1.7 trillion to W700 billion.

    But Kwon said while the semiconductor industry is in its most serious slump ever, “the crisis offers new opportunities. We won’t be shaken by short-term factors and will stick to our original plan again next year with a long-term outlook.”

    Kwon also revealed that takeover negotiations for U.S.-based memory chip maker SanDisk are still in progress. “We are still negotiating the price, and we have finished the legal review to see if we would violate antitrust laws if we acquire SanDisk,” he said. Samsung Electronics last Wednesday officially offered to buy SanDisk for $26 per share, or $5.85 billion in total, but the SanDisk board rejected the offer.

    Monday, September 15, 2008

    Human Genome Project

    The Human Genome Project (HGP) was an international scientific research project whose primary goals were to determine the sequence of the chemical base pairs which make up DNA and to identify the approximately 25,000 genes of the human genome from both a physical and functional standpoint.

    The project began in 1990 initially headed by James D. Watson at the U.S. National Institutes of Health. A working draft of the genome was released in 2000 and a complete one in 2003, with further analysis still being published. A parallel project was conducted by the private company Celera Genomics. Most of the sequencing was performed in universities and research centers from the United States, Canada and Britain. The mapping of human genes is an important step in the development of medicines and other aspects of health care.

    While the objective of the Human Genome Project is to understand the genetic makeup of the human species, the project also has focused on several other nonhuman organisms such as E. coli, the fruit fly, and the laboratory mouse. It remains one of the largest single investigational projects in modern science.

    Tuesday, September 09, 2008

    Anaerobic biodegradation of pollutants

    Anaerobic microbial mineralization of recalcitrant organic pollutants is of great environmental significance and involves intriguing novel biochemical reactions. In particular, hydrocarbons and halogenated compounds have long been doubted to be degradable in the absence of oxygen, but the isolation of hitherto unknown anaerobic hydrocarbon-degrading and reductively dehalogenating bacteria during the last decades provided ultimate proof for these processes in nature. Many novel biochemical reactions were discovered enabling the respective metabolic pathways, but progress in the molecular understanding of these bacteria was rather slow, since genetic systems are not readily applicable for most of them. However, with the increasing application of genomics in the field of environmental microbiology, a new and promising perspective is now at hand to obtain molecular insights into these new metabolic properties. Several complete genome sequences were determined during the last few years from bacteria capable of anaerobic organic pollutant degradation. The ~4.7 Mb genome of the facultative denitrifying Aromatoleum aromaticum strain EbN1 was the first to be determined for an anaerobic hydrocarbon degrader (using toluene or ethylbenzene as substrates). The genome sequence revealed about two dozen gene clusters (including several paralogs) coding for a complex catabolic network for anaerobic and aerobic degradation of aromatic compounds.

    Monday, August 25, 2008

    Thermodynamics

    Like all catalysts, enzymes do not alter the position of the chemical equilibrium of a reaction. Usually, in the presence of an enzyme, the reaction runs in the same direction as it would without the enzyme, just more quickly. However, in the absence of the enzyme, other possible uncatalyzed, "spontaneous" reactions might lead to different products, because under those conditions a different product is formed faster.

    Furthermore, enzymes can couple two or more reactions, so that a thermodynamically favorable reaction can be used to "drive" a thermodynamically unfavorable one. For example, the hydrolysis of ATP is often used to drive other chemical reactions. Enzymes catalyze the forward and backward reactions equally. They do not alter the equilibrium itself, but only the speed at which it is reached. For example, carbonic anhydrase catalyzes its reaction in either direction depending on the concentration of its reactants.
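    In symbols (a standard textbook relation, not from the original post): the equilibrium constant is the ratio of the forward and reverse rate constants, so an enzyme that accelerates both by the same factor leaves the equilibrium position unchanged:

```latex
K_{eq} \;=\; \frac{k_f}{k_r}
\qquad\Longrightarrow\qquad
K_{eq}^{\text{enzyme}} \;=\; \frac{\alpha\,k_f}{\alpha\,k_r} \;=\; K_{eq}
```

    The factor \(\alpha\) is the same in both directions because the enzyme lowers the activation energy of a transition state shared by the forward and reverse reactions.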

    Nevertheless, if the equilibrium is greatly displaced in one direction, that is, in a very exergonic reaction, the reaction is effectively irreversible. Under these conditions the enzyme will, in fact, only catalyze the reaction in the thermodynamically allowed direction.

    Thursday, August 21, 2008

    Geography Markup Language

    The Geography Markup Language (GML) is the XML grammar defined by the Open Geospatial Consortium (OGC) to express geographical features. GML serves as a modeling language for geographic systems as well as an open interchange format for geographic transactions on the Internet. Note that the concept of feature in GML is a very general one and includes not only conventional "vector" or discrete objects, but also coverages (see also GMLJP2) and sensor data. The ability to integrate all forms of geographic information is key to the utility of GML.

    Monday, August 11, 2008

    Data stream


    In telecommunications and computing, a data stream is a sequence of digitally encoded coherent signals (packets of data, or data packets) used to transmit or receive information that is in transmission.

    In electronics and computer architecture, a data stream specifies, for each point in time, which data item is scheduled to enter or leave which port of a systolic array, a reconfigurable data path array or similar pipe network, or other processing unit or block. The data stream is often seen as the counterpart of an instruction stream: the von Neumann machine is instruction-stream-driven, whereas its counterpart, the anti machine, is data-stream-driven.

    Tuesday, August 05, 2008

    Double-checked locking

    In software engineering, double-checked locking is a software design pattern, also known as the "double-checked locking optimization". The pattern reduces the overhead of acquiring a lock by first testing the locking criterion (the "lock hint") without synchronization; only if that check indicates that locking is required does the code proceed to acquire the actual lock.

    The pattern, when implemented in some language/hardware combinations, can be unsafe. It can therefore sometimes be considered to be an anti-pattern.

    It is typically used to reduce locking overhead when implementing "lazy initialization" in a multi-threaded environment, especially as part of the Singleton pattern. Lazy initialization avoids initializing a value until the first time it is accessed.
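    A sketch of the pattern in Java for lazy Singleton initialization. The volatile modifier is what makes this version safe under the Java 5 memory model; without it, a thread can observe a partially constructed instance, which is exactly why the pattern is often flagged as an anti-pattern:

```java
public final class Singleton {
    // volatile prevents other threads from seeing a half-built
    // instance through reordered writes (safe from Java 5 onwards).
    private static volatile Singleton instance;

    private Singleton() {}

    public static Singleton getInstance() {
        Singleton local = instance;            // first check, no lock
        if (local == null) {
            synchronized (Singleton.class) {   // lock only when needed
                local = instance;              // second check, with lock
                if (local == null) {
                    instance = local = new Singleton();
                }
            }
        }
        return local;
    }
}
```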

    Tuesday, July 29, 2008

    Implicit Web

    The Implicit Web is a concept coined in 2007 to denote web sites which specialize in the synthesis of personal information gleaned from the Internet into a single, coherent picture of user behavior. Implicit data may include clickstream information, media consumption habits, location tracking or any data generated without "explicit" input from a user. Presumed advantages of implicit data include accuracy, ease of input and comprehensiveness.

    The term Implicit Web was popularized by the technology investors Josh Kopelman, Fred Wilson, and Brad Feld.

    Monday, July 21, 2008

    Data hierarchy

    Data Hierarchy refers to the systematic organization of data, often in a hierarchical form. Data organization involves fields, records, files and so on.

    A data field holds a single fact. Consider a date field, e.g. "September 19, 2004". This can be treated as a single date field (e.g. a birthdate) or as three fields: month, day of month and year.

    A record is a collection of related fields. An Employee record may contain a name field (or fields), address fields, a birthdate field and so on.

    A file is a collection of related records. If there are 100 employees, then each employee would have a record (e.g. called Employee Personal Details record) and the collection of 100 such records would constitute a file (in this case, called Employee Personal Details file).

    Files are integrated into a database. This is done using a Database Management System. If there are other facets of employee data that we wish to capture, then other files such as Employee Training History file and Employee Work History file could be created as well.
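    A sketch of the hierarchy in Java, using the hypothetical Employee example above: a field holds a single fact, a record groups related fields, and a file is a collection of related records:

```java
import java.time.LocalDate;
import java.util.List;

// A record: a collection of related fields for one employee.
class EmployeeRecord {
    String name;           // a field holding a single fact
    String address;
    LocalDate birthdate;   // one date field, or split into day/month/year

    EmployeeRecord(String name, String address, LocalDate birthdate) {
        this.name = name;
        this.address = address;
        this.birthdate = birthdate;
    }
}

public class EmployeePersonalDetailsFile {
    public static void main(String[] args) {
        // A file: the collection of all employee records.
        List<EmployeeRecord> file = List.of(
                new EmployeeRecord("A. Smith", "12 High St", LocalDate.of(1980, 9, 19)),
                new EmployeeRecord("B. Jones", "3 Park Rd", LocalDate.of(1975, 4, 2)));
        System.out.println(file.size() + " records in the file");
    }
}
```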

    Monday, July 14, 2008

    Relational model

    The relational model for database management is a database model based on first-order predicate logic, first formulated and proposed in 1969 by Edgar Codd.

    Its core idea is to describe a database as a collection of predicates over a finite set of predicate variables, describing constraints on the possible values and combinations of values. The content of the database at any given time is a finite (logical) model of the database, i.e. a set of relations, one per predicate variable, such that all predicates are satisfied. A request for information from the database (a database query) is also a predicate.

    The purpose of the relational model is to provide a declarative method for specifying data and queries: we directly state what information the database contains and what information we want from it, and let the database management system software take care of describing data structures for storing the data and retrieval procedures for getting queries answered.
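    As a toy illustration (a hypothetical schema, not from the original post), a stored Employee table can be read as a predicate that is true for exactly the tuples it contains:

```latex
Employee(1,\ \text{Alice},\ \text{Sales}), \qquad Employee(2,\ \text{Bob},\ \text{IT})
```

    A query such as "the names of employees in Sales" is then simply another predicate built from it:

```latex
\{\, n \mid \exists\, i.\ Employee(i,\ n,\ \text{Sales}) \,\}
```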

    Tuesday, July 08, 2008

    Information ecology

    In the context of an evolving information society, the term information ecology was coined by various people in the 1980s and 1990s. It connects ecological ideas with the dynamics and properties of the increasingly dense, complex and important digital information environment, and it has been gaining progressively wider acceptance in a growing number of disciplines. "Information ecology" is often used as a metaphor, viewing the information space as an ecosystem.

    Wednesday, July 02, 2008

    Malware

    Malware is software designed to infiltrate or damage a computer system without the owner's informed consent. The term is a portmanteau of the words malicious and software. The expression is a general term used by computer professionals to mean a variety of forms of hostile, intrusive, or annoying software or program code.

    Many ordinary computer users are, however, still unfamiliar with the term, and most never use it. Instead, "computer virus" is used incorrectly in common parlance, and even in the media, to describe all kinds of malware, though not all malware are viruses.

    Software is considered malware based on the perceived intent of the creator rather than any particular features. Malware includes computer viruses, worms, trojan horses, most rootkits, spyware, dishonest adware, and other malicious and unwanted software. In law, malware is sometimes known as a computer contaminant, for instance in the legal codes of California, West Virginia, and several other American states.

    Monday, June 23, 2008

    Virus removal

    One possibility on Windows Me, Windows XP and Windows Vista is a tool known as System Restore, which restores the registry and critical system files to a previous checkpoint. Often a virus will cause a system to hang, and a subsequent hard reboot will render a system restore point from the same day corrupt. Restore points from previous days should work provided the virus is not designed to corrupt the restore files. Some viruses, however, disable system restore and other important tools such as Task Manager and Command Prompt. An example of a virus that does this is CiaDoor.

    Administrators have the option to disable such tools for limited users for various reasons. The virus modifies the registry to do the same, except that it blocks all users, including the Administrator, from accessing the tools. When a blocked tool is launched it gives the message "Task Manager has been disabled by your administrator.", even if the user trying to open the program is the administrator.

    Users running a Microsoft operating system can go to Microsoft's website to run a free scan, if they have their 20-digit registration number.

    Monday, June 16, 2008

    E-mail client

    An e-mail client, also known as a Mail User Agent (MUA) or email reader, is a frontend computer program used to manage email.

    Sometimes, the term e-mail client is also used to refer to any agent acting as a client toward an e-mail server, regardless of whether it is a real MUA, a relaying server, or a human typing directly on a telnet terminal. In addition, a web application providing the relevant functionality is sometimes considered an email client.

    Wednesday, June 11, 2008

    Robot software

    Robot software is the set of coded commands that tell a mechanical device (known as a robot) what tasks to perform and that control its actions. Robot software is used to perform and automate tasks. Programming robots is a non-trivial task. Many software systems and frameworks have been proposed to make programming robots easier.

    Some robot software aims at developing intelligent mechanical devices. Though common in science fiction stories, such programs are yet to become common-place in reality and much development is yet required in the field of artificial intelligence before they even begin to approach the science fiction possibilities. Pre-programmed hardware may include feedback loops such that it can interact with its environment, but does not display actual intelligence.

    Currently, malicious programming of robots is of some concern, particularly in large industrial robots. The power and size of industrial robots mean they are capable of inflicting severe injury if programmed incorrectly or used in an unsafe manner. One such incident occurred on 21 July 1984 when a man was crushed to death by an industrial robot. That incident was an accident, but shows the potential risks of working with robots. In science fiction, the Three Laws of Robotics were developed for robots to obey and avoid malicious actions.

    Monday, June 02, 2008

    Behavior-based robotics

    Behavior-based robotics or behavioral robotics or behavioural robotics is the branch of robotics that incorporates modular or behavior based AI (BBAI).

    The school of behavior-based robots owes much to work undertaken in the 1980s at the Massachusetts Institute of Technology by Professor Rodney Brooks, who with students and colleagues built a series of wheeled and legged robots utilising the subsumption architecture. Brooks' papers, often written with lighthearted titles such as "Planning is just a way of avoiding figuring out what to do next", the anthropomorphic qualities of his robots, and the relatively low cost of developing such robots, popularised the behavior-based approach.

    Brooks' work builds - whether by accident or not - on two prior milestones in the behavior-based approach. In the 1950s, W. Grey Walter, an English scientist with a background in neurological research, built a pair of vacuum tube-based robots that were exhibited at the 1951 Festival of Britain, and which have simple but effective behavior-based control systems.

    The second milestone is Valentino Braitenberg's 1984 book, "Vehicles - Experiments in Synthetic Psychology" (MIT Press). He describes a series of thought experiments demonstrating how simply wired sensor/motor connections can result in some complex-appearing behaviors such as fear and love.

    Tuesday, May 27, 2008

    Ergonomics

    Ergonomics, also called "engineering psychology" or "human factors", is the application of scientific information concerning objects, systems and environment for human use (definition adopted by the International Ergonomics Association in 2007). Ergonomics is commonly thought of as how companies design tasks and work areas to maximize the efficiency and quality of their employees' work. However, ergonomics comes into everything that involves people. Work systems, sports and leisure, and health and safety should all embody ergonomics principles if well designed.

    It is the applied science of equipment design intended to maximize productivity by reducing operator fatigue and discomfort. The field is also called biotechnology, human engineering, and human factors engineering. Ergonomic research is primarily performed by ergonomists who study human capabilities in relationship to their work demands. Information derived from ergonomists contributes to the design and evaluation of tasks, jobs, products, environments and systems in order to make them compatible with the needs, abilities and limitations of people.

    Monday, May 19, 2008

    Robot kit

    A robot kit is a special construction kit for building robots, especially autonomous mobile robots.

    Toy robot kits are also supplied by several companies. They are mostly made of plastic elements, like Lego Mindstorms and the Robotis Bioloid, or aluminium elements, like Lynxmotion's Servo Erector Set and the qfix kit.

    The kits can consist of: structural elements, mechanical elements, motors (or other actuators), sensors and a controller board to control the inputs and outputs of the robot. In some cases, the kits can be available without electronics as well, to provide the user the opportunity to use his or her own.

    Monday, May 12, 2008

    Software Process Innovation

    Software process innovation can take the form of the development of new techniques, tools or methods for software development, as for example with extreme programming (XP) or Scrum. It can concentrate on one phase of a more traditional development process, such as requirements elicitation, introducing more creative or imaginative techniques or tools. Software process innovations can be user-led, where expert users collaborate in the writing of software which meets their own needs (for example the Linux community). Process innovation can also focus on market analysis: where the demand for new software products lies. Common to many software process innovations is a focus on productive work and the avoidance of thrashing, unfocused work which is neither productive nor generating new ideas. A more modern preoccupation is with ‘flow’ (Csíkszentmihályi’s description of a mental state characterized by high energy and focus) in a software team.

    The relationship between software process innovation and innovative software products is a complex one. At the moment there is no particular evidence that innovative software processes necessarily result in innovative software products. Some forms of innovative software products may be best developed using traditional methods.

    Tuesday, May 06, 2008

    National Transportation Research Center (NTRC)

    The National Transportation Research Center (NTRC) is an institution, located in Knoxville, Tennessee, that conducts research and development aimed at increasing the efficiency and safety of transportation systems and reducing their energy utilization and effects on the environment.

    It is operated as a partnership between the United States Department of Energy, the University of Tennessee, and Oak Ridge National Laboratory (ORNL) and is located approximately halfway between the university campus and the ORNL site in Oak Ridge.

    Monday, April 28, 2008

    Islamic Golden Age

    The Islamic Golden Age, also sometimes known as the Islamic Renaissance, is traditionally dated from the 8th century to the 13th century, though some have extended it to the 15th or 16th centuries. During this period, engineers, scholars and traders in the Islamic world contributed to the arts, agriculture, economics, industry, law, literature, navigation, philosophy, sciences, and technology, both by preserving and building upon earlier traditions and by adding inventions and innovations of their own. Howard R. Turner writes: "Muslim artists and scientists, princes and laborers together created a unique culture that has directly and indirectly influenced societies on every continent."

    Monday, April 21, 2008

    Columbian Exchange

    The Columbian Exchange has been one of the most significant events in the history of world ecology, agriculture, and culture. The term is used to describe the enormous widespread exchange of plants, animals, foods, human populations (including slaves), communicable diseases, and ideas between the Eastern and Western hemispheres that occurred after 1492. Many new and different goods were exchanged between the two hemispheres of the Earth, and it began a new revolution in the Americas and in Europe. In 1492, Christopher Columbus' first voyage launched an era of large-scale contact between the Old and the New World that resulted in this ecological revolution: hence the name "Columbian" Exchange.

    Monday, April 14, 2008

    Entity-relationship model

    An entity-relationship model is an abstract conceptual representation of structured data. Entity-relationship modeling is a database modeling method, used in software engineering to produce a type of conceptual data model (or semantic data model) of a system, often a relational database, and its requirements in a top-down fashion. Diagrams created using this process are called entity-relationship diagrams, or ER diagrams for short. Originally proposed in 1976 by Dr. Peter Pin-Shan Chen, the process has since been elaborated in many variants.

    Monday, April 07, 2008

    HSQLDB

    HSQLDB is a relational database management system written in Java. It is based on Thomas Mueller's discontinued Hypersonic SQL Project. He later developed H2 as a complete rewrite.

    HSQLDB is available under a BSD license.

    It has a JDBC driver and supports a rich subset of the SQL-92, SQL:1999, and SQL:2003 standards. It offers a fast, small (less than 100k in one version, around 600k in the standard version) database engine which offers both in-memory and disk-based tables. Embedded and server modes are available.
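    A minimal sketch of talking to HSQLDB from Java via JDBC (assuming the HSQLDB jar is on the classpath; the table and data are made up for illustration):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HsqldbDemo {
    public static void main(String[] args) throws Exception {
        // "mem:" selects an in-memory table store; a file path in the
        // URL would select the disk-based store instead. "SA" with an
        // empty password is HSQLDB's default account.
        try (Connection con = DriverManager.getConnection(
                     "jdbc:hsqldb:mem:testdb", "SA", "");
             Statement st = con.createStatement()) {
            st.execute("CREATE TABLE city (name VARCHAR(40), pop INT)");
            st.execute("INSERT INTO city VALUES ('Oslo', 560000)");
            try (ResultSet rs = st.executeQuery("SELECT name, pop FROM city")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + ": " + rs.getInt(2));
                }
            }
        }
    }
}
```

    On very old JVMs or driver versions you may first need Class.forName("org.hsqldb.jdbcDriver") to register the driver; modern JDBC loads it automatically.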

    Friday, April 04, 2008

    Relational model

    The relational model for database management is a database model based on predicate logic and set theory. It was first formulated and proposed in 1969 by Edgar Codd with aims that included avoiding, without loss of completeness, the need to write computer programs to express database queries and enforce database integrity constraints. "Relation" is a mathematical term for "table", and thus "relational" roughly means "based on tables". It did not originally refer to the links or "keys" between tables, contrary to popular interpretation of the name.

    Monday, March 24, 2008

    File Transfer Protocol (FTP)

    In computing, the File Transfer Protocol (FTP) is a network protocol used to transfer data from one computer to another through a network, such as over the Internet.

    FTP is a commonly used protocol for exchanging files over any TCP/IP-based network and for manipulating files on another computer on that network, regardless of which operating systems are involved (if the computers permit FTP access). There are many existing FTP client and server programs. FTP servers can be set up on anything from game servers and voice servers to internet hosts and other physical servers.
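    A sketch of a simple FTP download in Java. This uses the Apache Commons Net library (an assumption on my part; the JDK itself has no high-level FTP client), and the host and path are hypothetical:

```java
import java.io.FileOutputStream;
import java.io.OutputStream;
import org.apache.commons.net.ftp.FTPClient;

public class FtpDownload {
    public static void main(String[] args) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com");            // hypothetical server
        ftp.login("anonymous", "guest@example.com");
        ftp.enterLocalPassiveMode();               // friendlier to firewalls/NAT
        try (OutputStream out = new FileOutputStream("readme.txt")) {
            ftp.retrieveFile("/pub/readme.txt", out);  // hypothetical path
        }
        ftp.logout();
        ftp.disconnect();
    }
}
```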

    Monday, March 17, 2008

    Systems design

    If the broader topic of product development "blends the perspective of marketing, design, and manufacturing into a single approach to product development," then design is the act of taking the marketing information and creating the design of the product to be manufactured. Systems design is therefore the process of defining and developing a system to satisfy specified requirements of the market or customer. Until the 1990s, systems design had a crucial and respected role in the data processing industry. In the 1990s, the standardization of hardware and software resulted in the ability to build modular systems. The increasing importance of software running on generic platforms has enhanced the discipline of software engineering.

    Object-oriented analysis and design methods are becoming the most widely used methods for computer system design. The UML has become the standard language used in Object-oriented analysis and design. It is widely used for modeling software systems and is increasingly used for designing non-software systems and organizations.

    Sunday, March 09, 2008

    HDTV blur

    HDTV blur is a common term used to describe a number of different artifacts on modern consumer high-definition television sets:
    The following factors are generally the primary or secondary causes of HDTV blur; in some cases more than one of these factors may be in play at the studio or receiver end of the transmission chain.
    • Pixel response time on LCD displays (blur in the color response of the active pixel)
    • Slower camera shutter speeds common in Hollywood production films (blur in the HDV content of the film)
    • Blur from eye tracking of fast-moving objects on sample-and-hold LCD, plasma, or microdisplay screens
    • Resolution resampling (blur due to resizing image to fit the native resolution of the HDTV)
    • Blur due to 3:2 pulldown and/or motion-speed irregularities in framerate conversions from film to video
    • Computer generated motion blur introduced by video games

    Tuesday, March 04, 2008

    Pharmacognosy

    Pharmacognosy is the study of medicines derived from natural sources. The American Society of Pharmacognosy defines pharmacognosy as "the study of the physical, chemical, biochemical and biological properties of drugs, drug substances or potential drugs or drug substances of natural origin as well as the search for new drugs from natural sources."

    Wednesday, February 27, 2008

    Organic light-emitting diode

    An organic light-emitting diode (OLED), also known as a Light Emitting Polymer (LEP) or Organic Electro-Luminescence (OEL) device, is any light-emitting diode (LED) whose emissive electroluminescent layer is composed of a film of organic compounds. The layer usually contains a polymer substance that allows suitable organic compounds to be deposited. They are deposited in rows and columns onto a flat carrier by a simple "printing" process. The resulting matrix of pixels can emit light of different colors.

    Sunday, February 24, 2008

    Neptune

    Neptune is the eighth and farthest planet from the Sun in the Solar System. It is the fourth-largest planet by diameter and the third-largest by mass. Neptune is 17 times the mass of Earth and is somewhat more massive than its near-twin Uranus, which is 15 Earth masses and less dense. The planet is named after the Roman god of the sea. Its astronomical symbol is a stylized version of Poseidon's trident.

    Discovered on September 23, 1846, Neptune was the first planet found by mathematical prediction rather than by regular observation. Unexpected changes in the orbit of Uranus led astronomers to infer the gravitational perturbation of an unknown planet. Neptune was found within a degree of the predicted position. The moon Triton was found shortly thereafter, but none of the planet's other 12 moons were discovered before the twentieth century. Neptune has been visited by only one spacecraft, Voyager 2, which flew by the planet on August 25, 1989.

    Friday, February 22, 2008

    Mosasaurs

    Mosasaurs (from the Latin Mosa, the Meuse river in the Netherlands, and the Greek sauros, meaning 'lizard') were serpentine marine reptiles. The first fossil remains were discovered in a limestone quarry at Maastricht on the Meuse around 1780. These ferocious marine predators are now considered to be the closest relatives of snakes, based on cladistic analysis of similarities in jaw and skull anatomy. Mosasaurs were not dinosaurs but lepidosaurs, reptiles with overlapping scales.

    These predators evolved from semi-aquatic squamates known as the aigialosaurs, close relatives of modern-day monitor lizards, in the Early Cretaceous Period. During the last 20 million years of the Cretaceous Period (Turonian-Maastrichtian), with the extinction of the ichthyosaurs and pliosaurs, mosasaurs became the dominant marine predators.

    Sunday, February 17, 2008

    Synthetic biology

    The term synthetic biology has long been used to describe an approach to biology that attempts to integrate (or "synthesize") different areas of research in order to create a more holistic understanding of life. More recently the term has been used in a different way, signaling a new area of research that combines science and engineering in order to design and build ("synthesize") novel biological functions and systems. The present article discusses the term in this latter meaning.

    Saturday, February 09, 2008

    Cybernetics

    Cybernetics is the study of feedback and derived concepts such as communication and control in living organisms, machines and organisations. Its focus is how anything (digital, mechanical or biological) processes information, reacts to information, and changes or can be changed to better accomplish the first two tasks.

    The terms "systems theory" and "cybernetics" have been widely used as synonyms. Some authors use the term cybernetic systems to denote a proper subset of the class of general systems, namely those systems that include feedback loops. However Gordon Pask's differences of eternal interacting actor loops (that produce finite products) makes general systems a proper subset of cybernetics. According to Jackson (2000), Bertalanffy promoted an embryonic form of general system theory (GST) as early as the 1920s and 1930s but it was not until the early 1950s it became more widely known in scientific circles.

    Sunday, January 27, 2008

    Trotskyism

    Trotskyism is the theory of Marxism as advocated by Leon Trotsky. Trotsky considered himself a Bolshevik-Leninist, arguing for the establishment of a vanguard party. He considered himself an advocate of orthodox Marxism. His politics differed sharply from those of Stalin or Mao, most importantly in declaring the need for an international "permanent revolution". Numerous groups around the world continue to describe themselves as Trotskyist and see themselves as standing in this tradition, although they have diverse interpretations of the conclusions to be drawn from this.

    Trotsky advocated proletarian revolution as set out in his theory of "permanent revolution", and he argued that in countries where the bourgeois-democratic revolution had not triumphed already (in other words, in places that had not yet implemented a capitalist democracy, such as Russia before 1917), it was necessary that the proletariat make it permanent by carrying out the tasks of the social revolution (the "socialist" or "communist" revolution) at the same time, in an uninterrupted process.

    Thursday, January 17, 2008

    Neuropsychology

    Neuropsychology is an interdisciplinary branch of psychology and neuroscience that aims to understand how the structure and function of the brain relate to specific psychological processes and overt behaviors. The term neuropsychology has been applied to both lesion studies of humans and animals and efforts to record electrical activity from individual cells (or groups of cells) in higher primates.

    It is scientific in its approach and shares an information processing view of the mind with cognitive psychology and cognitive science.

    Thursday, January 10, 2008

    Clinical psychology

    Clinical psychology includes the study and application of psychology for the purpose of understanding, preventing, and relieving psychologically-based distress or dysfunction and to promote subjective well-being and personal development. Central to its practice are psychological assessment and psychotherapy, although clinical psychologists may also engage in research, teaching, consultation, forensic testimony, and program development and administration. Some clinical psychologists may focus on the clinical management of patients with brain injury—this area is known as clinical neuropsychology. In many countries clinical psychology is a regulated mental health profession.

    Friday, January 04, 2008

    Anti-realism

    In philosophy, the term anti-realism is used to describe any position involving either the denial of an objective reality of entities of a certain type or the denial that verification-transcendent statements about a type of entity are either true or false. This latter construal is sometimes expressed by saying "there is no fact of the matter as to whether or not P." Thus, we may speak of anti-realism with respect to other minds, the past, the future, universals, mathematical entities (such as natural numbers), moral categories, the material world, or even thought. The two construals are clearly distinct and often confused.