Archive for July, 2010
The hosting of the 2010 FIFA World Cup, which has pumped an estimated R93-billion into the local economy, has rebranded South Africa and created a favourable climate for direct foreign investment and tourism growth, says KPMG senior economist Frank Blackmore.
“One does not have to be an economist to know that things went well,” he said at a KPMG post-2010 World Cup panel discussion in Johannesburg this week.
The June 11 to July 11 sporting event contributed around 0,5% to South Africa’s yearly gross domestic product growth, and between 4% and 6% to the country’s quarterly growth.
Currently, South Africa’s tourism rate is around 20% higher than it would normally be at this time of year, and it is believed that for every ten tourists coming into the country, one job opportunity is created.
A recent survey by African Response found that 96% of World Cup visitors to South Africa said that they would possibly return to the country, while 92% would recommend the country to friends and family as a holiday destination.
Murray & Roberts construction executive director Trevor Fowler, who also participated in the KPMG panel discussion, said that the money spent on infrastructure for the event had provided South Africa, as a developing country, with some “much-needed” infrastructure.
“Our roads have seen great improvement, public transport has been elevated to a level not experienced in the country before, we have built stadiums of the highest global standards, and hotels, accommodation and other facilities have been constructed that can now be used by the people of South Africa, tourists entering its borders and other sporting events.
“In fact, the country has already received some enquiries from Brazil, which will host the next World Cup in 2014, to assist in its planning efforts.”
Fowler further said that the Gautrain, for instance, had far exceeded expectations. “We initially estimated that between 3 000 and 6 000 people a day would use the train, and currently we are seeing around 13 000 people using the train a day on weekdays and 20 000 people on weekends.
“This has also shown us that a significant public private partnership, such as the Gautrain project, can be executed with great success.”
Blackmore also pointed out that the almost R800-billion infrastructure “cracker” helped mitigate the effects of the global recession. “While the rest of the world were licking their financial wounds, we here in the south were beavering away.”
Meanwhile, with South Africa’s newly found confidence in hosting big sporting events, the country has indicated that it would consider putting in an official bid for the 2020 Olympic Games.
However, Fowler pointed out that this would essentially be quite different to hosting the soccer World Cup, seeing that there were a large number of different sports and events, and thousands of athletes that would have to be accommodated in one city over a three-week period.
He said that a lot of new infrastructure would be needed, such as large swimming pools and athletic tracks that would not necessarily be that easy to use after the event.
Nevertheless, Fowler said that the building of the infrastructure, especially in a developing country such as South Africa, could be beneficial.
He pointed out that government was also keen on continuous investment in the country’s infrastructure, and that a national infrastructure plan was currently in development and would be put in place by the end of the year. “This also includes things such as pipelines, undersea cables and power infrastructure.”
All in all, KPMG audit director Devon Duffield said that the money spent during the 2010 World Cup was still circulating in South Africa, and that the faster money circulated, the more wealth was created.
The Department of Energy (DoE) is progressing with the development of the second integrated resource plan, or IRP2010, and remains on track to deliver the document by September, Energy Minister Dipuo Peters told Engineering News Online on Friday.
Peters explained that consultations had been completed with working groups, and that discussions would need to be held with the Inter-Ministerial Committee on energy before decisions were finalised.
The document would then be sent to the Cabinet, and if accepted, only then would it be made public.
The DoE is spearheading the IRP2010, which would map out South Africa’s energy requirements for the next 20 years.
Peters said previously that the country had reached a “delicate situation, which requires us to take bold and decisive decisions on whether to build coal-fired or nuclear power stations for baseload energy requirements”.
Once it was decided what percentage of energy was to be derived from which specific resources, private-sector investment was expected to accelerate.
The North Gauteng High Court has ordered the Department of Home Affairs (DHA) to delay the implementation of its new work-permit regime for foreign truck drivers working for local companies, until October 28.
The DHA had begun enforcing a work permit regime for all foreign truck drivers working for local companies crossing the border into South Africa as of July 1.
Hitherto, foreign truck drivers working for South African companies could enter the country with a visitor’s visa that was valid for 30 days.
However, the local trucking industry was heavily opposed to the July implementation of the new system, saying that there was a month-long backlog in the processing of the work permits by the DHA.
A group of nearly 30 trucking companies had lodged a court case against the implementation at the beginning of July.
The case had been up for argument in the court this week, during which time legal counsel for the DHA had provided the additional information sought by the court at the previous hearing.
The group of trucking companies had accepted the court’s decision to grant a three-month interim extension for the implementation of the work permit system, Global Migration SA MD Leon Isaacson told Engineering News Online.
If the DHA did not comply with the court’s order to grant the foreign drivers access, or was not processing the work permit applications in time, the trucking companies would head back to court on October 28 to ask for a further extension.
The design of and preparations for the €1,5-billion international Square Kilometre Array (SKA) project are being driven out of a suite of offices in the University of Manchester, in England. These house the SKA Programme Development Office (SPDO).
The SKA itself will be sited either in South Africa (in the Karoo, in the Northern Cape province, with outstations in some other African countries) or in Australia (with outstations in New Zealand). The final decision on the siting of the instrument, which will be the biggest radio telescope ever built, is planned to be made in early 2012.
The University of Manchester, which is one of the world’s leading centres for radio astronomy, won the tender to host the SPDO. The university hosts the headquarters of both the world-renowned Jodrell Bank Centre for Astrophysics (JBCA) and of the UK’s Merlin long baseline interferometry (LBI) radio telescope network. All three agencies are housed in the same building.
The SKA project is currently managed by the SKA Science and Engineering Committee (SSEC), which is headed by a small executive committee. In addition, an Agencies SKA Group (ASG), composed of the national funding agencies which will pay for the SKA, concerns itself with the high-level policy issues of funding options, the site selection process and the short and long-term governance of the project.
“The SSEC is composed of leading scientists and engineers in the SKA collaboration who are responsible for the scientific and technical directions taken by the international project. The executive committee of the SSEC has teleconferences with the director of the SPDO every month,” explains SKA South Africa associate director: science and engineering, Professor Justin Jonas, who is one of South Africa’s representatives on the SSEC and a member of its executive committee. “The ASG liaises with the SSEC but is not in charge of the SSEC. In fact, neither the SSEC nor the ASG is a formally established legal entity. There is limited central SKA funding at the moment; development work is for the most part funded by institutions across the globe from local sources of money. This is expected to change within a year or two to a more centralised funding system.”
“Our office was formally set up on January 1, 2003. We now have 16 people and our job is to coordinate the international effort in developing and maintaining the science case and coordinating the systems design for the SKA and carrying out the characterisation of the two candidate sites for the SKA,” reports SPDO director Professor Richard Schilizzi. (Schilizzi was born in the UK, brought up in Australia, worked for many years in the Netherlands and recently returned to work and live in Britain.) “Our budget is currently about €700 000 a year, which comes from a common fund contributed by the SKA partner institutions and agencies around the world, plus from the European Union seventh research framework programme (or FP7) project, PrepSKA.” (PrepSKA is an acronym for the Preparatory Phase of the SKA).
“The process by which the final site decision is reached is still to be formally established. But the contours are becoming clear,” he adds. “It’ll involve a period in the latter half of this year and the first quarter of next year, in which the selection criteria are determined and the weights to be given to those criteria agreed on.” This will all have to be finished by the end of March 2011, because that is when the characterisation of the two sites (in South Africa and Australia) shortlisted for the SKA is set to be concluded.
This site characterisation – which includes determining the radio frequency interference (RFI) at both sites, using identical equipment – is being funded under FP7 PrepSKA. The RFI measuring systems are housed in a South African-designed container and include an Australian spectrometer, and have undergone acceptance trials in South Africa, with a combined team of SPDO, South African and Australian engineers.
A lot of other work is also going on. Under FP7 PrepSKA, there are seven work packages.
These are: Management, Systems Design, Site Characterisation, Governance, Procurement and Interactions with Industry, Funding Options and the Implementation Plan (including discussion of the nonscience benefits of the SKA).
The Governance, Procurement and Funding work packages are categorised as policy work packages and are led by funding agencies in individual European countries – Governance by the Netherlands, Procurement by Italy and Funding by the UK. For example, under the Governance work package there is a working group looking at potential legal entities to own and operate the SKA. There is currently a partnership agreement between the SKA programme members but this runs out next year, and could either be renewed or replaced.
“We’re certainly making good progress. We’re working to a schedule partly dictated by the FP7 funding from Brussels,” assures Schilizzi. “Just recently, we agreed on the baseline design for phase one. We’re about to start looking at the configuration of the telescope – how we would put down the SKA at either site. We plan to have a costed system design for the telescope by the end of 2012.”
Both South Africa and Australia are developing precursor telescopes for the SKA. Both will take the form of arrays of dish antennas. (It should be noted that the selection of the site for the SKA, and selection of the technologies to be used in the instrument, are completely separate issues.)
The South African project is designated MeerKAT and comprises up to 80 dishes and is expected to start doing science in 2013. Its prototype, the seven-dish MeerKAT Precursor Array (MPA – also known as KAT-7), in the Karoo, is now operational for engineering tests.
The Australian project is named Askap (Australian SKA Pathfinder) and will comprise 36 dishes. The first of these is now operational, and Askap is also expected to be finished and fully operational in 2013. Both these instruments will provide input for the final design of the SKA. However, the SKA will not merely be a very large MeerKAT or Askap.
The latest baseline design for the SKA involves three compact arrays, or cores, each containing a separate array. One would be an array of dish antennas and the other two would be aperture arrays, consisting of many small ‘elemental’ antennas, arranged in groups of tens, hundreds or even thousands of elements. One of the aperture arrays would be optimised for long wavelengths (this being the sparse aperture array) and the other for shorter wavelengths (the dense aperture array). The aperture arrays are fixed in position but can be steered electronically to point at different parts of the sky and can even point in more than one direction simultaneously. (Previously, the concept had been a single core with planar arrays in the centre, surrounded by a broad ring of dishes.)
“Each of these core arrays will be about 5 km in diameter,” states SPDO international project engineer Professor Peter Dewdney, a Canadian. “This configuration was formally adopted only recently by the SSEC. Each core will house a different technology . . . because each technology covers a different range of wavelengths (and, so, frequencies). For the SKA, we’re talking about wavelengths of about 4 m to about 3 cm – it could be 2 cm. We call the antennas ‘receptors’ and no single receptor design can cover this entire range of wavelengths. But covering all these wavelengths will allow the SKA to examine a very wide range of celestial objects and phenomena.”
In the sparse aperture array, the antennas will take the form of two crossed dipoles. These – and other planar antennas – have very small electrical voltages induced in them by the incoming radio waves. The signals from the array elements are combined digitally to produce a usable signal. Recording, processing and analysing these signals provides information about the source of the radio waves. The sparse aperture array will cover the wavelengths ranging from 4 m to 1 m, and prototypes of these antennas are expected to start operation in 2012 or 2013.
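The electronic steering described above works by combining the element signals digitally with a phase shift applied to each element. The following is an illustrative delay-and-sum sketch, not SKA code; the element count, spacing, frequency and angles are made-up values chosen for demonstration.

```python
import numpy as np

# Delay-and-sum beamforming for a 1-D line of fixed antenna elements.
# Applying a different phase weight to each element's signal before
# summing "points" the array electronically, with no moving parts.

c = 3.0e8                      # speed of light, m/s
freq = 150e6                   # illustrative observing frequency (2 m wavelength)
wavelength = c / freq
n_elements = 16
spacing = wavelength / 2       # half-wavelength element spacing
positions = np.arange(n_elements) * spacing

def array_response(steer_deg, source_deg):
    """Normalised power received from a source at source_deg when the
    array is electronically steered towards steer_deg (degrees from zenith)."""
    steer = np.deg2rad(steer_deg)
    source = np.deg2rad(source_deg)
    # Phase of the incoming plane wave at each element
    arrival = np.exp(2j * np.pi * positions * np.sin(source) / wavelength)
    # Compensating phase weights for the chosen pointing direction
    weights = np.exp(-2j * np.pi * positions * np.sin(steer) / wavelength)
    combined = np.sum(weights * arrival)
    return abs(combined) ** 2 / n_elements ** 2

# Steering at the source gives full response; off-axis sources are suppressed.
print(array_response(30.0, 30.0))   # on-axis: response of 1.0
print(array_response(30.0, 45.0))   # off-axis: strongly attenuated
```

Steering towards several directions at once, as the aperture arrays will do, amounts to applying several independent sets of weights to the same digitised element signals.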
The dish array will cover the wavelength range from 1 m to 3 cm or even 2 cm, if the dishes can be manufactured accurately enough. “We have avant garde dish designs – avant garde in several respects, from both the manufacturing side and the electronics side,” highlights Dewdney. “But the principle of all dish designs is the same – they’re basically radio mirrors which concentrate incoming radio waves at a point we call the focus of the dish.”
Regarding manufacturing, radio telescope dishes have traditionally been made from metal, but, for the SKA, composite dishes have been developed and are being tested. “Composite dishes are being pioneered simultaneously and separately by South Africa and Canada,” he highlights. “The scale and accuracy involved in the production of composite dishes are unprecedented.”
Canada is using carbon fibre to produce both the dish and its support structure, while
South Africa’s composite dishes (which do not use carbon fibre) use both metal and composite support structures. To date, Canada has built one 10-m-diameter composite dish, while South Africa has built eight – the single 15-m Experiment Demonstrator Model (XDM), at the Hartebeesthoek Radio Astronomy Observatory, in Gauteng province, west of Pretoria, and the MPA’s seven 12-m dishes.
But composite dishes do not reflect radio waves by themselves, so they must be fitted with a metallic radio reflecting surface in order to work. “The XDM dish uses a flame-sprayed aluminium surface. A layer of aluminium powder was sprayed on to the mould, and the composite material was then laid on top,” explains SKA South Africa (SKA SA) associate director Anita Loots. “This is quite labour intensive but it gives the dish a solid reflective surface, which means no avoidable restrictions on the wavelengths it can receive.”
The Canadian and the seven South African MPA dishes, however, have metallic meshes embedded in them to reflect radio waves. This approach is easier and cheaper than that used for the XDM. Unfortunately, the size and shape of the mesh limits the radio frequencies the dishes can receive. So it is essential to determine the optimum mesh design to achieve maximum dish performance. Thus, the first three South African MPA dishes each have a different design of mesh embedded in them, while the Canadian dish uses a fourth design.
“There is quite a lot of research and development (R&D) still to be done on the dishes,” says Loots. “Only when we have completed the test programme will a decision be made on which approach to use for MeerKAT. But ours (the MPA) is the first composite array in the world that actually works.”
The Canadian dish is likely to be used in the SKA Dish Verification Programme led by the US Technology Development Programme (TDP). The TDP antenna is expected to be a prototype for the SKA dishes and will be built and tested in the US.
“The composites industry is developing rapidly,” points out Dewdney. “New composites are developed all the time. Strength to weight ratios are improving, costs are coming down and the chemistry is improving.”
For Askap, the Australians plan to use 12-m-diameter metal dishes, saving on costs by having them manufactured in China. All the dish designs proposed for the SKA will be examined in the Dish Verification Programme, the outcome of which will determine which design is adopted for the SKA.
Whatever dish design is chosen, it is intended that they will be capable of accommodating phased array feeds. “Like the aperture arrays, phased array feeds will be able to observe in several directions at the same time, like a multipixel camera, whereas traditional dishes can look in only one direction at a time with the equivalent of a single pixel,” explains Dewdney. “Phased array feed technology is being pioneered in Australia, Canada, the Netherlands and the US. A group in the Netherlands has actually demonstrated astronomical capability with phased array feeds using a prototype on the Westerbork Synthesis Radio Telescope.”
Dense aperture arrays are more complicated to develop than sparse arrays because, with the antennas so closely packed, the electrical voltages induced in them interact with each other. Characterisation and calibration of this behaviour will determine the performance of the dense array.
“Alternative technologies for the dense aperture arrays are being developed in the UK and the Netherlands. While work on sparse aperture arrays is pretty much global, dense aperture development work is concentrated in Europe,” he states. (For information on the British development work, see Engineering News, July 9, 2010.)
For all the different types of antennas, there are two primary parameters that have to be met: the best possible efficiency in transferring the incoming radio waves to electrical signals, and low instrumental noise (interference). Digital systems, as will be used in the SKA, generate significant amounts of radio frequency interference. It is of paramount importance that measures are taken to contain this interference and not allow it to feed back into the antennas and dwarf the signals which the telescope seeks to detect.
It is expected that R&D on the dense aperture array and the phased array feeds for dish antennas will advance sufficiently to allow a decision to be made on their use during 2016.
MeerKAT, Askap and (hopefully) the TDP are not the only SKA precursors/pathfinders, nor are all SKA precursors/pathfinders concerned mainly with antenna and signal processing technology. In the UK, a major upgrade of the country’s Merlin (Multi-Element Radio Linked Interferometer Network) LBI network will also contribute to development of key SKA technology.
Interferometry involves using two or more radio telescope dishes (or other antennas) to look at the same object in the sky. The signals received by each dish are fed into a computer. Because the dishes are not in exactly the same place (even if they are only a few tens of metres apart), the distance travelled by the signals to each is not identical, and combining them creates an interference pattern that can be analysed by computer to provide high-resolution images of celestial objects.
The distance between the dishes is called the baseline. With long baseline interferometry, this baseline can be hundreds of kilometres long. With very long baseline interferometry (VLBI), it can reach thousands of kilometres. VLBI can involve radio telescopes on different continents. It can even involve radio telescopes on earth and in space.
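The path-length difference described above is easy to quantify: for two dishes separated by a baseline b, a wavefront arriving at angle θ from the perpendicular reaches the second dish b·sin(θ)/c seconds later, and the correlator must compensate for this delay before combining the signals. A minimal sketch, with an illustrative 200 km baseline (not an actual Merlin figure):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def geometric_delay(baseline_m, angle_deg):
    """Extra time (seconds) the wavefront takes to reach the second dish.

    baseline_m -- separation between the two dishes, metres
    angle_deg  -- source direction, degrees away from the baseline's perpendicular
    """
    return baseline_m * math.sin(math.radians(angle_deg)) / C

# A 200 km long-baseline pair observing a source 30 degrees off the perpendicular:
delay = geometric_delay(200_000, 30.0)
print(f"{delay * 1e6:.1f} microseconds")  # prints "333.6 microseconds"
```

The delay changes continuously as the earth rotates, which is why the correlator must track and update the compensation throughout an observation.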
Merlin is a permanent LBI network involving seven radio telescopes at six locations across England. The Merlin radio telescopes are also frequent participants in European VLBI network operations, where the baselines stretch across Europe.
Merlin’s heart is the Jodrell Bank observatory, in Cheshire, which has two of the network’s radio telescopes – the 53-year-old, 76-m-diameter, 3 500 t Lovell (also known as the Mark 1A) instrument, which is the world’s third-largest fully steerable radio telescope, and the elliptical Mk 2 dish, which measures about 32 m along its main axis. The Jodrell Bank observatory is also the location of the Merlin correlator system. (A third, small radio telescope at Jodrell Bank is not part of Merlin.) The other five instruments in Merlin are at (from west to east) Knockin, Darnhall, Pickmere, Defford and Cambridge.
“The quality of the results in LBI is very highly dependent on the phase coherency of the individual elements in the network – every telescope has to observe the same source with the same timing and frequency phase reference,” explains JBCA digital systems engineer Chris Shenton. (He is also the digital systems engineer for UK PrepSKA). “To achieve this, we distribute a centralised reference frequency from Jodrell Bank, which is generated by a maser.” ‘Maser’ is an acronym for Microwave Amplification by Stimulated Emission of Radiation, and a maser produces a narrow beam of monochromatic coherent radiation.
Originally, in Merlin, this reference frequency was distributed to the other radio telescopes, and analogue signals from each telescope were transmitted back to Jodrell Bank through a UK-wide network of microwave links. This microwave network is now being replaced by a fibre-optic network in an upgrade called e-Merlin.
“With the Local Oscillator distributed over the fibre-optic network, we can achieve a very precise wide area timing distribution, with picosecond (0,000 000 000 001 of a second) resolution,” reports Shenton. “This is done using a centralised timing reference and local timing facilities. The central reference is used to periodically recalibrate the local timing references. e-Merlin will show one possible solution to the problem of how to carry out wide area timing and frequency phase distribution in the SKA.” This will be essential for the SKA, with its thousands of antennas and its tens of outstations.
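The scheme Shenton describes – local timing references that free-run between periodic corrections from a central reference – can be sketched conceptually as follows. The drift rate and recalibration interval below are invented for illustration and are not real e-Merlin figures.

```python
# Conceptual sketch of a drifting local clock that is periodically
# recalibrated against a central reference distributed over fibre.
# The drift rate and interval are illustrative assumptions only.

class LocalClock:
    def __init__(self, drift_per_second):
        self.offset = 0.0                 # seconds of accumulated timing error
        self.drift = drift_per_second     # fractional frequency error

    def advance(self, seconds):
        """Free-running: timing error grows linearly with elapsed time."""
        self.offset += self.drift * seconds

    def recalibrate(self):
        """A correction from the central reference zeroes the accumulated error."""
        self.offset = 0.0

# Without recalibration, a 1 ns/s drift accumulates ~3.6 us of error per hour.
free_running = LocalClock(drift_per_second=1e-9)
free_running.advance(3600)
print(free_running.offset)

# Recalibrating every second bounds the error at roughly 1 ns at any moment.
disciplined = LocalClock(drift_per_second=1e-9)
for _ in range(3600):
    disciplined.advance(1)
    disciplined.recalibrate()
print(disciplined.offset)
```

The shorter the recalibration interval relative to the drift rate, the tighter the bound on the local error – the trade-off the e-Merlin timing distribution is designed around.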
So far, four of the Merlin radio telescopes have been linked by fibre optic cables, two of which are now also capable of receiving their timing via fibre. “We’ve already been running SKA-related experiments,” he highlights. “With the Darnhall radio telescope, which is some ten miles (16 km) south-east of Jodrell Bank, we successfully replaced the microwave link mechanism for transferring the reference phase with the fibre-optical link and demonstrated that the quality of the new link is far superior to, and much more robust than, that of the microwave link.”
Another experiment saw the successful testing last year of a small-scale direct digital receiver, which directly converted mid-frequency radio waves into digital signals. Currently, at many radio telescopes, high-frequency signals received are converted into a lower intermediate frequency (IF) before they undergo digital conversion. “We want to eliminate IF and have direct conversion to digital, because using IF has cost and complexity implications,” says Shenton. “We’re trying to simplify the system.”
South African vehicle and component manufacturers exporting to Europe are likely to face tough conditions until 2012, as mounting government debt and budget deficits continue to dampen demand on the continent.
One such manufacturer is Toyota South Africa Motors (TSAM), which exports its Corolla to the continent, with current production at 150 vehicles a day, far short of the 440-units-a-day installed capacity at the Durban plant.
Local component manufacturers exporting to Europe are also feeling the pinch.
“Our biggest export market is Europe, and European markets are signalling distress in terms of car sales for 2010 and 2011. They paint a rather bleak picture, which isn’t good news for the component sector,” says National Association of Automotive Component and Allied Manufacturers (Naacam) executive director Roger Pitot.
More than 50% of South Africa’s component exports go towards vehicle assembly in Europe.
UK-based PricewaterhouseCoopers (PWC) Autofacts senior analyst Michael Gartside tells Engineering News Online that “the first decent year of recovery [in Europe] will be 2012”.
He says PWC was forced to revise its original forecast that next year would be the year of recovery in terms of new passenger car sales in the European Union (EU).
Gartside says he expects sales in the 30 European countries researched to reach 13,4-million cars in 2010, down from 14,46-million in 2009, increasing marginally to 13,6-million units in 2011, and then finally improving to 14,4-million cars in 2012.
Did the government scrappage schemes – where some European governments last year offered incentives to consumers to stimulate demand for new cars – assist in creating the current problem, as had been widely anticipated?
No, says Gartside.
“A lot of people believe it simply brought forward demand, but we think the incentives brought new buyers into the market.”
He notes that Germany, for example, heavily incentivised the sale of new vehicles, making it affordable for someone who could never before own a vehicle to go out and become a car owner.
In general, PWC’s research shows that the hangover from the various scrappage schemes has been less dramatic than many anticipated, with growing evidence of stronger underlying demand.
March and April sales in Germany were just 7,5% lower than the same period in 2008, before the crisis ensued.
Moreover, many markets not distorted by scrappage schemes have shown strong recovery. Belgium, Finland, the Netherlands, Norway and Sweden have seen demand grow by a collective 21% from January through May.
Instead, the new threat to vehicle sales comes from rising government debt and the measures taken to reduce it.
Measures in Greece, Ireland, Portugal, Romania and Spain may have implications for new car demand as most include a public sector wage freeze, while in Spain, they include a 5% pay cut in 2010, followed by a freeze in 2011, with VAT to also increase this year.
In Greece the impact is evidenced by vehicle registrations falling 54% in May.
Overall, EU risk factors are now weighted on the downside, says Gartside, and, hence, the expectation of a slowing recovery.
He adds that the recession and the 2008 oil-price spike, combined with the number of first-time buyers entering the market on the back of the various scrappage schemes, have all served to change the structure of the European market.
Gartside says that sales of multipurpose and sports-utility vehicles have fallen significantly, with the sales of small cars rising sharply.
The small car sector grew from 37% of the car market in Europe in 2007, to 45% in 2009.
However, Gartside says that this trend may cool down in 2010 as scrappage schemes fall away.
Valorie and Jeff Elkin of Houston will endow three scholarships totaling $75,000 at Texas A&M University.
The scholarships, to be funded through multi-year pledges to the Texas A&M Foundation, will be awarded to junior- or senior-level students in computer science and engineering, economics and petroleum engineering.
“My wife and I have a long connection to the university. Valorie’s father, David Ray Howell, Class of ’48, was commander of the combined bands. I am a petroleum engineer and recruited for many years from several departments on campus. Our kids graduated from the economics and the computer science and engineering departments,” said Jeff Elkin, president and co-founder of Empresa Energy LP.
“Watching our children pass through their respective departments reminded me how important outside support is for students. There is limited support for hard-working students who excel in class but do not meet traditional financial aid scenarios,” he said.
Students competing for the Elkin scholarship in economics will be Texas residents enrolled in upper level classes with a 3.0 grade point average and only one-fifth of their tuition costs covered by financial assistance.
“Helping students meet the costs of their education is not the only benefit of a scholarship,” said Ben M. Crouch, interim dean of liberal arts. “Mr. Elkin’s scholarship is also an investment in the future of each student recipient. Students understand this and respond accordingly.”
Recipients of the Elkin scholarships in computer science and engineering and in petroleum engineering will be upperclassmen enrolled in honors level courses in their respective departments and not currently receiving scholarship support for more than 20 percent of tuition costs. In addition, petroleum engineering recipients will be members of the Society of Petroleum Engineers student chapter and maintain a minimum 2.5 grade point average.
“The computer science and computer engineering programs are experiencing an increase in demand from very bright students. The Elkin scholarship is critical in helping us to attract and retain these top students,” said Valerie E. Taylor, computer science and engineering department head and Royce E. Wisenbaker Professor.
“We continue to have strong demand from individuals of high caliber to be part of the Texas A&M petroleum engineering program. There is a definite need for the Elkin endowment to reward and retain these high-achieving students,” said Stephen A. Holditch, petroleum engineering department head and Samuel Roberts Noble Foundation Endowed Chair Professor.
Jeff Elkin is a Texas A&M Class of 1980 petroleum engineering graduate. As an undergraduate he participated in the Society of Petroleum Engineers student chapter.
Elkin’s three decades in the petroleum engineering industry included positions with Ocean Energy, Seagull Midcon Inc., Amoco and other independent companies. In 2004 he co-founded Empresa Energy, a Houston-based private oil and gas exploitation and development company.
Valorie Howell Elkin, Texas A&M Class of 1980, earned a degree in educational curriculum and instruction. She and her husband met in the Commons dormitory during their freshman year.
The Elkins are 18-year Century Club members, currently at the Diamond level, of the Association of Former Students, and Jeff is a past president of the Amarillo Texas A&M Club. Son Joshua Elkin, Class of 2004, majored in computer engineering and computer science; and daughter Jennifer Elkin Bombulie, Class of 2008, in economics. Both children graduated with honors.
“The Elkins are an all-Aggie family, and they truly love this university. Their generous gift of multiple scholarships will provide vital support for generations of students striving to become Aggie graduates,” said Brady Bullard, director of development for engineering with the Texas A&M Foundation.
The Department of Computer Science and Engineering and the Harold Vance Department of Petroleum Engineering are two of 12 departments in the Dwight Look College of Engineering. The college ranks 9th in undergraduate studies and 8th in graduate programs in public institutions in the nation, according to U.S. News & World Report.
The Department of Economics is one of 12 in the College of Liberal Arts. The department offers B.A. and B.S. degrees that prepare students to apply the principles and theories of economics together with the concepts and logic of mathematics.
Written by Betsy Ellison
The term dedicated server refers to an advanced form of web hosting in which the customer rents, and has complete control over, an entire server. Internet connectivity is provided to the server, in many cases over 10 or 100 Mbit/s Ethernet. Dedicated servers are most often housed in data centers, similar to colocation facilities, providing redundant power sources and HVAC systems.
The term dedicated server is also sometimes used to refer to a game server for games such as Counter-Strike in which a computer runs only the server portion of the game; the game is not played on the same computer as the server. This typically results in the server being able to handle several more concurrent connections, as the computer is not bogged down with tasks such as the rendering of graphics.
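The split just described, where the server runs only the game logic while clients handle all the graphics, can be sketched as a tiny echo loop. This is an illustrative toy in Python, not the protocol of Counter-Strike or any real game server:

```python
import socket
import threading

def start_dedicated_server(host="127.0.0.1", port=0):
    """Bind the listening socket for a headless game server.
    port=0 lets the OS pick a free port."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((host, port))
    server.listen(1)
    return server

def serve_one_client(server):
    """Accept a single client, acknowledge one update, then shut down.
    No rendering happens here: the server only moves game state around."""
    conn, _addr = server.accept()
    with conn:
        update = conn.recv(1024)        # e.g. a player position packet
        conn.sendall(b"ACK:" + update)  # echo it back as the 'game state'
    server.close()
```

Because the process never touches graphics, all of its capacity goes to handling connections, which is why a dedicated game server supports more concurrent players than a listen server running on a player's machine.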
A dedicated server usually refers to the actual computer your hosting runs on: you rent an entire computer for yourself and do not share the CPU, memory, and so on with anyone else.
With shared hosting you share one computer, or server, with many other people who also have websites, dividing all of that one machine's resources among you.
The biggest advantage of a dedicated server is the opportunity to manage all your hosting needs, including hardware and operating system requirements. Note the dedicated server's operating system support: a dedicated server may run an operating system of the Web developer's own choice. Do the required maintenance: a dedicated server needs administrative upkeep, including upgrades, daemon updates and security patches. Add other special features to the server: these often include a serial console, remote backup space, or automated restore of the operating system.
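The maintenance routine above, upgrades, daemon updates and security patches, boils down to comparing installed versions against the latest releases. A minimal sketch, with invented package names and version numbers:

```python
def packages_needing_patches(installed, latest):
    """Return names of packages whose installed version lags the latest
    available release (simple dotted-number comparison)."""
    def parse(version):
        return tuple(int(part) for part in version.split("."))
    return sorted(name for name, version in installed.items()
                  if name in latest and parse(version) < parse(latest[name]))
```

A real update check would pull the version lists from the distribution's package manager; this only illustrates the comparison step.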
HostingPalace brings innovative web hosting technology to the domestic hosting market. The control panel HostingPalace provides for your domain is among the best currently available. It is your domain control panel, from which you manage every aspect of your domain and its contents. It is designed so that an individual can even act as a domain registrar, with the authority to register domains for themselves or for clients, and to modify the hosting account whenever requirements change, so domain resellers benefit from the technology as well. It has become more user-friendly and more reliable.
When you access your web hosting account, everything you need is available right there in the hosting or domain control panel.
The main tools within the panel let you do the basic domain and webspace administration needed to keep your website in order. You can set or reset your login details, FTP details and email accounts; access and maintain all your databases; review basic statistics for your website; check your bandwidth use; check which scripts are supported; block certain IP addresses (depending on the hosting package terms) from accessing your website; scan for and clean up viruses; back up your entire site; and carry out other general maintenance, or grooming, of your domain.
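One of the panel features listed above, blocking certain IP addresses, amounts to checking each visitor against a blocklist of addresses or CIDR ranges. A minimal sketch; the rule values are invented for illustration:

```python
import ipaddress

def is_blocked(client_ip, blocklist):
    """True if client_ip matches any blocked address or CIDR range."""
    address = ipaddress.ip_address(client_ip)
    return any(address in ipaddress.ip_network(rule, strict=False)
               for rule in blocklist)
```

A panel would typically write such rules into the web server's configuration (e.g. deny directives) rather than evaluate them in application code, but the matching logic is the same.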
If your web hosting plan allows it, you can actually set up different domains within your single account and control them all through your hosting or domain control panel.
Within your web hosting panel you will more often than not find a handy extra application called a file manager. It lets a client deploy website files to the webspace without using an FTP account. This built-in feature is especially handy for hosting resellers, who then need not memorize or look up FTP login details for each client's domain. Uploads and downloads are not tightly restricted: you can deploy as many files as your webspace allows, though some online tools limit how many files can be selected in a single upload. Bandwidth is not much affected by this, and unlimited upload and download are possible where the package includes unlimited webspace and bandwidth.
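Before writing an upload into your webspace, a file manager has to make sure the target path stays inside your account's directory. A sketch of that safety check; the function name and layout are invented for illustration:

```python
import os
import shutil

def deploy_file(webspace_root, relative_path, source_file):
    """Copy an uploaded file into the webspace, refusing any target
    path that would escape the webspace root (e.g. via '..')."""
    root = os.path.realpath(webspace_root)
    dest = os.path.realpath(os.path.join(root, relative_path))
    if not dest.startswith(root + os.sep):
        raise ValueError("path escapes webspace root")
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    shutil.copyfile(source_file, dest)
    return dest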
Some hosting panels let you add Java applications separately to the webspace package. As Java is an important and widely used platform, most web hosting companies make sure the control panel includes the compatible features needed to run Java applications.
Many web hosting companies have added an online shopping application to their webspace packages, which lets clients add a store to their website with ease rather than wrestling with manual setup and editing. These applications now often come free with web hosting packages, so clients benefit when hosting web applications.
HostingPalace has started providing free search engine submission for clients of two years' standing. Based on demand from clients for a free extra alongside their webspace packages, HostingPalace now helps clients submit their website URL free of charge to over 800,000 search engines.
A dedicated web hosting service, dedicated web server, or managed web hosting service is a form of Internet hosting in which the client is offered an entire server not shared with anyone else. A dedicated server is more flexible than shared hosting because the client has full control over the server(s), including the choice of operating system, hardware and much more. Server administration can often be provided by the hosting organization as an add-on service. Some dedicated server offerings deliver less overhead and a larger return on investment. Dedicated servers are generally housed in data centers, similar to colocation facilities, providing redundant power sources and HVAC systems. In contrast to colocation, the server hardware is owned by the provider, and in some cases the provider will support your operating system or applications.
A dedicated hosting company offers the dedicated server and also takes responsibility for maintenance and backups, providing the security your websites require, power management and many other aspects of running a data center. The website's professional developers remain responsible for any server software issues that arise.
The demand for a dedicated hosting server arises as a site grows and adds more services to reach more clients, requiring more advanced technologies, heavier use of system resources and increased bandwidth to serve content to a web browser efficiently. With this increased need for resources, having a whole server for your site is often the answer. A dedicated hosting server is also an option for reseller hosting businesses.
A dedicated server is an excellent solution for traffic-heavy businesses that would otherwise need more staff, resources and security to build, install and maintain an in-house solution. Leasing a dedicated server can save a company the cost of a network administrator position. A dedicated hosting account is usually cheaper overall than an in-house solution, and businesses can see savings of more than 80% on a per-month basis.
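The per-month comparison behind a claim like "savings of more than 80%" is simple arithmetic. The cost figures below are invented purely for illustration:

```python
def monthly_savings_pct(in_house_cost, dedicated_cost):
    """Percent saved per month by leasing instead of running in-house."""
    return round(100 * (in_house_cost - dedicated_cost) / in_house_cost, 1)

# Hypothetical figures: $1,500/month in-house (hardware amortization
# plus admin time) versus a $250/month dedicated server lease.
savings = monthly_savings_pct(1500, 250)  # 83.3
```

Whether the real numbers land above or below 80% depends entirely on local staff costs and the lease price.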
There is a kind of web hosting available called dedicated IP web hosting. This is also sometimes called static web hosting. This type of hosting provides a unique IP address used exclusively for each individual server space or domain. Anyone browsing the Internet can access the information held on one of these servers by going to a website with that domain name or IP address.
There will be differences in these web hosting accounts depending on whether the IP address is shared or unique. A unique IP address can be bought simply by placing an order with any of the multitude of service providers available. These web hosting companies will provide a unique IP address after full payment has been made, and their websites allow the customer to upload content and other files to their own personal, secure web space.
These types of dedicated hosting accounts are ideal for larger businesses or for e-commerce websites that need to be certain their site offers the most in security. Dedicated IP web hosting also suits individuals willing to buy a unique IP address at a premium price. Some may choose this route because it helps keep the website from collecting spam, a problem that can often occur on a shared site.
Another advantage of a dedicated IP web hosting service is that it provides greater flexibility while also allowing for more complex hosting. Two examples are a private SSL certificate and anonymous FTP. An SSL certificate serves online businesses that need to ensure their customers have secure transactions. Anonymous FTP lets people share information over the Internet: anybody on the Internet can go to the company's website and access a public directory using FTP software.
By gaining a dedicated IP web hosting account and securing a unique IP address, a business owner has full control over the website and can get the most benefit out of it. This can also lead to better search engine results, which businesses much seek after.
Dedicated IP addresses are highly valued and considered prime Internet real estate, so there are sometimes high service fees associated with this kind of service. It is possible, however, to find a web hosting company that offers dedicated IP addresses at a very reasonable price. If a company thinks it will benefit from dedicated IP addresses, it is important to research many different providers. Besides the inevitable differences in price, there will also be differences in how well each provider matches the company's needs and goals.
CentOS is a free, enterprise-class operating system based on Linux. CentOS has a number of benefits compared to other Linux rebuild projects, including an active and growing user community, a widespread mirror network, accessible developers and multiple free support avenues.
The name CentOS stands for Community ENTerprise Operating System. CentOS is a free Linux distribution built from the original Red Hat Enterprise Linux (RHEL) source code. It is a very simple-to-use Linux-based operating system with outstanding control panel and third-party software support (offering 32-bit and 64-bit builds).
CentOS is free, complies fully with the upstream vendor's redistribution policies, and aims to be 100% binary compatible. (CentOS mainly changes packages to remove upstream vendor branding and artwork.)
CentOS is developed by a small but growing team of core developers. In turn, the core developers are supported by an active user community including system administrators, network administrators, managers, enterprise users, core Linux contributors and Linux enthusiasts from around the world.
No industry standards have been put in place to define the role of management by dedicated server suppliers. Each dedicated hosting provider therefore defines its packages to its own standard, so many ranges are available. For some suppliers, fully managed means having a web-based control panel, while others define it as having dedicated system engineers readily available to handle all server and network related functions.
Important server management services comprise some or all of the following:
Operating system updates
Simple Network Management Protocol hardware monitoring
Scheduled backups and restoration
Disaster recovery
DNS hosting service
Software installation and configuration
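The hardware monitoring service in the list above reduces to comparing readings against thresholds and raising an alert on any breach. A minimal sketch, with hypothetical metric names and limits:

```python
import shutil

def usage_alerts(metrics, limits):
    """Return alert strings for every reading that exceeds its threshold."""
    return [f"{name}: {value}% exceeds {limits[name]}%"
            for name, value in metrics.items()
            if name in limits and value > limits[name]]

def disk_percent(path="/"):
    """One real reading a monitor might feed in: disk fill percentage."""
    total, used, _free = shutil.disk_usage(path)
    return round(100 * used / total, 1)
```

A production monitor would gather readings over SNMP from the server's agents; here the gathering step is reduced to a single local disk check.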
Dedicated hosting providers describe their level of management based on the services they offer. As a result, what one supplier calls fully managed could equal what another calls self managed.
Administrative maintenance of the operating system often comprises upgrades, security patches and, from time to time, even daemon updates. Other types of management may include adding new domains and users, daemon configuration, or even custom programming.
Dedicated hosting providers may offer the following types of managed support:
Fully Managed - includes monitoring, software updates, reboots, security patches and operating system upgrades. Customers have no work to do themselves.
Managed - includes a medium level of management: monitoring, updates and a limited amount of support. Customers carry out some specific tasks themselves.
Self Managed - includes regular monitoring and some maintenance. Customers perform most operations and tasks on the server themselves.
Unmanaged - no involvement from the service provider. Customers provide all maintenance, upgrades, patches and security.
When renting server space from a host, you essentially have two options: rent a dedicated server or rent shared server space. With a shared hosting arrangement, your website shares server space with other websites. If you choose to rent a dedicated server, you get a complete server and its network connection to yourself.
Shared servers are less expensive to rent than dedicated servers. They typically need a lower level of technical ability too, as the host does a large amount of the server administration. This is why shared servers are generally the best option for entry-level websites or for small organizations whose sites do not have high traffic levels.
While shared servers are the most cost-effective choice for small websites, they are not necessarily a good fit for large, "mission-critical" or high-traffic sites. For these, a dedicated server may well be necessary.
Dedicated servers are more expensive to rent than shared servers, and they also need a higher level of technical knowledge to operate. However, if you are making thousands a day from e-commerce and your organization would fail if the server went down for a day or more, then you should seriously consider renting a dedicated server.
If you are currently engaged in any facet of ecommerce, even service sectors, a website makes up a great deal of your business. Therefore, it is a safe assumption that you pay for hosting your website in some fashion. If you do not already have a dedicated server, perhaps you should revisit the decision for the best hosting options for your business.
Most web hosting companies set up accounts on a shared Linux hosting server. You essentially share the total hard drive and bandwidth allowance with many others. This may not be the best hosting option and can present problems such as security risks and traffic bottlenecks. On a dedicated server, the machine is completely yours, with no other websites on it; the server is dedicated entirely to you and your business.
Several advantages can make a dedicated server the best hosting choice for you. These include:
Server security - Dedicated servers increase the security of your website tremendously. There are no other webmasters using the same workspace, and the simple mistakes or user errors that might occur on shared machines simply no longer exist.
Storage space - As the entire server is dedicated to a single customer, there is tremendously more storage available for website pages, images and features.
Bandwidth - As with storage space, there is a great deal more bandwidth available for data transfer. Traffic to your site no longer competes with traffic to other websites, reducing bottlenecks and slow server response times.
Control - Having your own server offers additional opportunities for control. Sharing a server means you have only limited control over server features and functions, but with a dedicated server, webmasters have greater control of and access to the day-to-day functions of the host.
Software options - Dedicated servers also allow more software and script options. The server has greater storage capacity for this information, and there is no need to align coding or features with other users of the same machine.
Deciding when to move to a dedicated server involves numerous variables and requires weighing several different questions. Dedicated servers are the top option, since the user and his or her business will have them exclusively; but they are costly and require somebody to manage them, which increases the expense of the business's web hosting. One should weigh all of these issues before taking the step.
A chief and important question is whether the chosen dedicated server is affordable. One needs to do some calculation to see how much a dedicated server costs, and to consider how much must be paid for an admin, or for a fully managed dedicated server instead. These can cost a great deal, especially on a small budget. Be sure your finances permit a dedicated server before shopping for one.
Second, consider whether you need a managed or an unmanaged dedicated server. If you are well skilled and know how to manage a server on your own, you can opt for an unmanaged server; if not, a managed server is the better choice. Keep this in mind early, before you find yourself mourning an overrun budget when it is too late.
Most important is whether you really need the space and bandwidth an individual server provides. If your web pages load quite quickly and you still have plenty of space on a shared server, think about why you would spend money on a dedicated one. If space and bandwidth are what you need, go ahead; but if you also hold no sensitive information, it is a waste of money.
The question arises frequently: why do people want dedicated servers? The answer is that a dedicated server holds only that individual's information, instead of data from a diversity of people and businesses. So if a user holds particularly sensitive information, he or she cannot take chances with it getting exposed; a dedicated server is the ideal choice in this case.
If users have their own dedicated server, they get a range of tools and options at their fingertips that they cannot get on a shared server. There are no limits with a dedicated server, since it is entirely the user's own, so the user can do anything he or she wants. The only factor the user needs to judge is not to disregard the growing size of the budget.