Digital Cameras



Sales of digital cameras seem to be exploding. This is not surprising, as the quality of images from digital cameras improves and they become more affordable. In addition, consumers are becoming more comfortable with the idea of digital photography. With so many different types of digital cameras available, choosing the one that best suits your needs can be confusing. There are several things to consider when choosing a digital camera.

Types of Digital Camera

There are basically three types of camera, whether you are shooting film or digital: point-and-shoot cameras, prosumer cameras, and professional-quality cameras.

1. Point and Shoot cameras are fully automatic. They do everything for you: the camera chooses the correct exposure and decides whether a flash is needed. The photographer only needs to point and shoot.
2. Prosumer Cameras. These cameras are a step up from point-and-shoot cameras and allow the user either to shoot in fully automatic mode or to take some control over the exposure by using specific exposure modes. For example, there might be a portrait mode, an action mode, and a close-up mode.
3. Professional Cameras. These cameras allow the photographer to look directly through the lens, which means that what you see is what you get. They also provide complete control over the exposure, offering a fully automatic mode, specific exposure modes, as well as a fully manual mode.

Resolution

Regardless of what type of camera you decide to get, you will also have to decide on the camera's resolution.
If you have shopped at one of those electronics superstores, you may have been led to believe that the most important thing to consider when purchasing a digital camera is the number of pixels. Although pixels are important, there is much more to choosing a digital camera. Pixels are the tiny squares (or, in some newer models, other shapes) that make up the image. The more pixels an image contains, the sharper and more detailed it will be.
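As a rough illustration, a few lines of Python show how a sensor's pixel dimensions translate into megapixels and into the largest print that stays sharp. The 300 DPI figure used below is a common rule of thumb rather than a fixed standard, and the 3000 x 2000 sensor is just an example.

```python
# Rough guide: megapixels from sensor dimensions, and the largest
# print (in inches) that stays sharp at 300 dots per inch.

def megapixels(width_px: int, height_px: int) -> float:
    """Total pixel count expressed in millions."""
    return width_px * height_px / 1_000_000

def max_print_size(width_px: int, height_px: int, dpi: int = 300):
    """Largest print, in inches, at the given print resolution."""
    return width_px / dpi, height_px / dpi

# Example: a 3000 x 2000 sensor
print(megapixels(3000, 2000))      # 6.0 megapixels
print(max_print_size(3000, 2000))  # (10.0, ~6.67) inches
```

Doubling each dimension quadruples the pixel count, which is why megapixel numbers climb faster than perceived sharpness does.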

CD Mastering

Even though many assume that mixing the separate audio tracks is the final step, a recording should always be mastered in order to sound its best. CD mastering is the final chance for creative input when you create a compact disc. After the disc has been mastered, it can be printed, reproduced, and sold.
The process of mastering a CD actually involves several steps. The first step is putting the songs (tracks, at this point) in the correct order. The length of the gaps between songs is also adjusted, along with any editing of the songs themselves. Any unlisted or secret songs on the CD are normally added at this point as well.

There are several ways to go about mastering a CD. First, the mix can be sent to a professional CD mastering engineer, which is what professional musicians normally choose to do. Mastering engineers often work in their own mastering facilities, which differ from standard studios in that they have much less gear and are designed for the best possible playback of the mix, so that anything wrong can be heard and fixed.

Aside from mastering engineers, CDs can also be mastered at home using computer software. This option is normally more realistic for unsigned artists or musicians who are just starting out. Depending on the quality of the software and the skill of the person doing the mastering, the CD may turn out perfectly or it may sound very unprofessional.
Online CD mastering is another option. CDs mastered online can turn out well; instead of posting a mix to a mastering engineer, the mix is sent via the Internet, which requires a high-speed connection.

The cheapest way to master a CD is with free mastering programs. Artists and musicians may choose these for demos or other early recordings that they send to major record labels to generate interest in their music.

The major difference between a professional CD and an amateur recording is normally found in the mastering. Every song you hear on the radio has been thoroughly mastered to sound its best. While you can master a CD yourself using free programs, a professional CD mastering engineer is normally the best choice if your band is looking to make a profit from its music.
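To give a flavour of what mastering software does under the hood, here is a toy Python sketch of one of the simplest mastering-style operations, peak normalization, which scales a track so its loudest sample hits a target level. Real mastering also involves EQ, compression and limiting; the function name and sample values here are illustrative, not from any particular product.

```python
# Toy sketch of peak normalization: scale a track so that its
# loudest sample reaches a chosen target level.
# Samples are floats in the range -1.0 .. 1.0.

def peak_normalize(samples, target_peak=0.95):
    """Scale samples so the maximum absolute value equals target_peak."""
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return list(samples)  # silence: nothing to scale
    gain = target_peak / peak
    return [s * gain for s in samples]

track = [0.1, -0.4, 0.25, -0.05]
mastered = peak_normalize(track)
print(max(abs(s) for s in mastered))  # 0.95
```

The same gain is applied to every sample, so the relative dynamics of the track are preserved; only the overall level changes.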

A new iPhone app that can tell how you look!

Want to know how ugly or good looking you are? A new iPhone application can tell you in moments. The Ugly Meter app lets users take a photo of a person's face and then "analyses" its contours in real time before displaying a score out of 10. The higher the score, the uglier the face. An on-screen grid helps the user line up the picture for best results and, moments after the picture is taken, the app's caustic judgment is displayed on the screen.

When tested on the faces of some of the world's most glamorous celebrities, the results were a trifle controversial, the Daily Mail reported. In a head-to-head battle between fellow X-Factor judges Cheryl Cole and Dannii Minogue, it was the former Girls Aloud star who came out on top. She scored a respectable 4.2, which prompted the acerbic put-down: "Wow you're ugly, is your doctor a vet?" But Dannii fared even worse, scoring a terrible 9.8 out of ten, and was told: "You're so ugly, you could make a glass eye cry."

While it is obviously just a bit of fun, some may take issue with the Ugly Meter's judgment of curvaceous TV star Christina Hendricks. Despite being the object of lust for millions of men across the world, the Mad Men actress' pale-faced beauty did not impress the gadget's scanner: she scored a top 'ugliness' score of 10 out of 10. However, the Ugly Meter was rather more complimentary about Angelina Jolie, who scored an almost-perfect two out of 10, earning her the admiring phrase: "You're so hot you make the sun jealous." But there was bad news for her husband Brad Pitt, who was beaten by David Cameron. The British prime minister might not be everyone's idea of a chisel-jawed hunk, but he still managed to outpoint Brad by a single point.

The app costs 99p to download from Apple's App Store and requires 5.5MB of free space on the iPhone to work. It was developed by app designers The Dapper Gentlemen.

Oscar forays into mobile handset market, to invest Rs. 100 Crore


Consumer electronics manufacturer Oscar on Thursday forayed into the highly competitive mobile handset market with 10 devices and will invest Rs. 100 crore on marketing this year. "We will invest Rs. 100 crore this year on our product and marketing. By the end of the first year, we hope to sell one lakh mobiles per month in North and East India, with a turnover of Rs. 250 crore," Oscar Group Executive Director Suresh Wadhwani said.
Oscar has priced its products between Rs. 1,700 and Rs. 4,500. Initially, Oscar will work with manufacturers in China and source its products from there, but will set up a unit in the country once it achieves sales of 1 lakh units per month.

Oscar, which manufactures colour televisions, home theatre systems, DVD players, and multimedia speaker systems, is the latest entrant in the Rs. 70,000-crore handset market in India. Of the overall handset market, about 50 percent is cornered by ultra-low-cost handsets. The segment has witnessed at least one new mobile vendor every month. These new home-grown entrants, such as Micromax, Lava, Wynn Telecom, Lemon, Zen and Olive Telecom, are giving tough competition to well-entrenched players such as Nokia, Samsung and Sony Ericsson.

HP ProLiant Servers



After the introduction of the HP ProLiant DL360 G7 and DL380 G7 servers, HP announced the launch of G7 servers with AMD Opteron processors in India. While the DL360 and DL380 offer a 20:1 consolidation ratio, the newly launched rack-optimized HP ProLiant DL165 G7 and HP ProLiant DL385 G7 offer a 23:1 consolidation ratio. In a cloud-based infrastructure, a lot depends on how the virtual servers function. Rajesh Dhar, Director, Industry Standard Servers, HP India, said: "G7 is a completely virtual server which has been patented to provide Ethernet
The HP ProLiant servers, known as the "magnificent seven," form part of the HP Converged Infrastructure portfolio. It incorporates servers, storage, network devices and facility resources, which would help companies channel their resources towards IT innovation.

Dhar said that HP had come up with the G7 ProLiant after a careful survey of 500 companies based across India, Australia, China and Japan. The research shows that the challenges faced by companies include rigid infrastructure, information explosion and aging applications, which stand as barriers to IT innovation. Seven out of 10 respondents felt that a lot of money goes into routine management and maintaining operations. Dhar also said, "India being the fourth largest market in Asia Pacific, 75 percent of Indians are open to the idea of accepting innovation solutions." By offering these innovation solutions, HP plans to free up the money tied up in unnecessary spending and at the same time provide a way to self-fund restructuring and innovation.

The HP ProLiant servers include HP Thermal Logic technologies, which help reduce power consumption by up to 96 percent. The servers offer cooling and efficiency solutions by automatically tracking thermal activity using 32 smart sensors placed strategically throughout the server, positioned so that they do not disturb its operation.

Ramchandran V, National Product Manager, HP, said, "Customers who look for high proficiency at less cost would benefit. HP's cooling solutions would help companies save around $2,300." He also revealed that the G7 is a step beyond the G6, as it can save up to 10 kilowatts of power in a system, and that it integrates with building-management software. Existing G6 customers can update their systems to the G7 ProLiant without any major difficulty. HP also helps train its partners and offers cash-flow management.

X-Ray Satellite Homes in on a Black Hole's Jets


ScienceDaily — For decades, X-ray astronomers have studied the complex behavior of binary systems pairing a normal star with a black hole. In these systems, gas from the normal star streams toward the black hole and forms a disk around it. Friction within the disk heats the gas to millions of degrees -- hot enough to produce X-rays. At the disk's inner edge, near the black hole, strong magnetic fields eject some of the gas into dual, oppositely directed jets that blast outward at about half the speed of light. That's the big picture, but the details have been elusive. For example, do most of the X-rays arise from the jets? The disk? Or from a high-energy region on the threshold of the black hole?
Now, astronomers using NASA's Rossi X-ray Timing Explorer (RXTE) satellite, together with optical, infrared and radio data, find that, at times, most of the X-rays come from the jets.

"Theoretical models have suggested this possibility for several years, but this is the first time we've confirmed it through multiwavelength analysis," said David Russell, lead author of the study and a post-doctoral researcher at the University of Amsterdam.

Russell and his colleagues looked at a well-studied outburst of the black-hole binary XTE J1550-564. The system lies 17,000 light-years away in the southern constellation of Norma and contains a black hole with about 10 times the sun's mass. The usually inconspicuous binary was discovered by RXTE in 1998, when the system briefly became one of the brightest X-ray sources in the sky.

Between April and July 2000, the system underwent another outburst. RXTE monitored the event in X-rays, with some additional help from NASA's Chandra X-ray Observatory. Optical and infrared observations covering the outburst came from the YALO 1-meter telescope at Cerro Tololo Inter-American Observatory in Chile, while radio observations were collected by the Australia Telescope Compact Array.

Drawing on these data, Russell and his team reconstructed a detailed picture of X-ray emission during the outburst. The study appears in the July 1 edition of Monthly Notices of the Royal Astronomical Society.

"We suspect that these outbursts are tied to increases in the amount of mass falling onto the black hole," explained Russell. "Where and how the emission occurs are the only clues we have to what's going on."
As the outburst began in mid-April 2000, the system's brightest X-ray emission was dominated by higher-energy ("hard") X-rays from a region very close to the black hole.
"We think the source of these X-rays is a region of very energetic electrons that form a corona around the innermost part of the disk," Russell said. When these electrons run into photons of visible light, the collision boosts the photons to hard X-ray energies, a process known as inverse Compton scattering. The jets were present, but only minor players.
Over the next couple of weeks, the peak X-ray emission moved to lower ("softer") energies and seems to have come from the dense gas in the accretion disk. At the same time, the hot disk quenched whatever process powers the jets and shut them down.
By late May 2000, XTE J1550-564's accretion disk was cool enough that the jets switched on again. Most of the X-rays, which were fainter but higher in energy, again came from scattering off of energetic electrons close to the black hole.
In early June, as the system faded and its peak emission gradually softened, the jets emerged as the main X-ray source. In the jet, electrons and positrons moving at a substantial fraction of light speed emit radiation as they encounter magnetic fields, a process called synchrotron emission. The jets require a continuous supply of particles with energies of a trillion electron volts -- billions of times the energy of visible light. "The total energy bound up in the jet is enormous, much larger than previously thought," Russell said.

As summer wore on, the jets gradually faded and their X-ray emission softened. By September, the system's brightest X-rays came from high-speed blobs of matter that the jets had hurled into space during previous eruptions.

"We're really beginning to get a handle on the 'ecology' of these extreme systems, thanks in large part to RXTE," Russell added. "We can apply what we've learned in nearby binaries like XTE J1550 to the supersized black holes and jets found at the centers of galaxies."

Launched in 1995, RXTE is still going strong. "Of currently operating NASA missions, only Hubble has been working longer," said Tod Strohmayer, the mission's project scientist at NASA's Goddard Space Flight Center in Greenbelt, Md. RXTE's unique capabilities provide insight into accreting black holes and neutron stars and allow it to detect short, faint outbursts that are easily missed by other current missions exploring the X-ray regime.
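The quoted energy scale is easy to sanity-check. Taking roughly 2 eV as the energy of a visible-light photon (an assumed typical value for green light, not a figure from the article), a trillion-electron-volt jet particle carries hundreds of billions of times more energy:

```python
# Quick check: a 1 TeV jet particle vs. a visible-light photon.
tev_in_ev = 1e12          # one trillion electron volts
visible_photon_ev = 2.0   # rough energy of a visible photon (assumed)

ratio = tev_in_ev / visible_photon_ev
print(f"{ratio:.0e}")     # 5e+11 -- hundreds of billions of times
```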

ATM security flaws could be a jackpot for hackers

A security expert has identified flaws in the design of some automated teller machines that make them vulnerable to hackers, who could make the ubiquitous cash dispensers spit out their cash holdings. Barnaby Jack, head of research at Seattle-based security firm IOActive Labs, will demonstrate methods for "jackpotting" ATMs at the Black Hat security conference in Las Vegas, which starts on July 28.
"ATMs are not as secure as we would like them to be," said Jeff Moss, founder of the Black Hat conference and a member of President Obama's Homeland Security Advisory Council. "Barnaby has a number of different attacks that make all the money come out."
Jack declined to discuss his techniques before the conference. The world's biggest ATM manufacturers include Diebold Inc and NCR Corp; officials with those companies could not be reached for comment.

Banks may cringe when he speaks, fearing would-be crooks will adopt his methods. But Moss said that going public will raise awareness of the problem among ATM operators and prompt them to tighten security. One potential route of attack is via communications ports that are sometimes accessible from outside an ATM, Moss said. "You want everybody to know there are possible ways to jackpot these machines, so they will go and get their machines updated," he said.

Joe Grand, a hardware security expert, said he was not surprised to learn of Jack's research. "People are starting to realize that hardware products do have security vulnerabilities. Parking meters, ATMs, everything that has electronics in it can be broken," Grand said. "A lot of times a hardware product is just a computer in a different shell."

'Butterfly Effect' in the Brain Makes the Brain Intrinsically Unreliable


ScienceDaily — Next time your brain plays tricks on you, you have an excuse: according to new research by UCL scientists published June 30 in the journal Nature, the brain is intrinsically unreliable.
This may not seem surprising to most of us, but it has puzzled neuroscientists for decades. Given that the brain is the most powerful computing device known, how can it perform so well even though the behaviour of its circuits is variable?

A long-standing hypothesis is that the brain's circuitry actually is reliable -- and the apparently high variability is because your brain is engaged in many tasks simultaneously, which affect each other.

It is this hypothesis that the researchers at UCL tested directly. The team -- a collaboration between experimentalists at the Wolfson Institute for Biomedical Research and a theorist, Peter Latham, at the Gatsby Computational Neuroscience Unit -- took inspiration from the celebrated butterfly effect -- from the fact that the flap of a butterfly's wings in Brazil could set off a tornado in Texas. Their idea was to introduce a small perturbation into the brain, the neural equivalent of butterfly wings, and ask what would happen to the activity in the circuit. Would the perturbation grow and have a knock-on effect, thus affecting the rest of the brain, or immediately die out?
It turned out to have a huge knock-on effect. The perturbation was a single extra 'spike', or nerve impulse, introduced to a single neuron in the brain of a rat. That single extra spike caused about thirty new extra spikes in nearby neurons in the brain, most of which caused another thirty extra spikes, and so on. This may not seem like much, given that the brain produces millions of spikes every second. However, the researchers estimated that eventually, that one extra spike affected millions of neurons in the brain.
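The amplification described above can be sketched as a simple branching process. The factor of roughly 30 comes from the article; the toy model below ignores overlapping targets and saturation, so it overstates growth at later generations, but it shows why a single spike can plausibly reach millions of neurons within a handful of steps:

```python
# Branching-process sketch: each extra spike triggers ~30 more
# in the next "generation" of neurons.

def spikes_per_generation(branching_factor=30, generations=5):
    """Spike counts per generation, starting from one injected spike."""
    counts = [1]  # the single extra spike
    for _ in range(generations):
        counts.append(counts[-1] * branching_factor)
    return counts

print(spikes_per_generation())
# [1, 30, 900, 27000, 810000, 24300000]
```

After just five generations the naive count exceeds 24 million, consistent with the researchers' estimate that one extra spike eventually affects millions of neurons.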

"This result indicates that the variability we see in the brain may actually be due to noise, and represents a fundamental feature of normal brain function," said lead author Dr. Mickey London, of the Wolfson Institute for Biomedical Research, UCL.

This rapid amplification of spikes means that the brain is extremely 'noisy' -- much, much noisier than computers. Nevertheless, the brain can perform very complicated tasks with enormous speed and accuracy, far faster and more accurately than the most powerful computer ever built (and likely to be built in the foreseeable future). The UCL researchers suggest that for the brain to perform so well in the face of high levels of noise, it must be using a strategy called a rate code. In a rate code, neurons consider the activity of an ensemble of many neurons, and ignore the individual variability, or noise, produced by each of them.
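A minimal sketch of the rate-code idea: if each neuron's spike count is the true rate plus heavy noise, averaging across a large ensemble largely cancels the noise, since the standard error shrinks roughly as one over the square root of the ensemble size. The rate and noise values below are illustrative, not from the study:

```python
# Rate-code sketch: noisy single-neuron readings vs. an ensemble average.
import random

random.seed(0)
TRUE_RATE = 10.0   # underlying firing rate (spikes per interval), assumed
NOISE_SD = 5.0     # per-neuron variability, assumed

def neuron_reading():
    """One neuron's spike count: signal plus Gaussian noise."""
    return TRUE_RATE + random.gauss(0, NOISE_SD)

readings = [neuron_reading() for _ in range(10_000)]
estimate = sum(readings) / len(readings)
print(round(estimate, 2))   # close to 10.0 despite per-neuron noise of 5.0
```

A single neuron's reading can be off by several spikes, but the ensemble average recovers the underlying rate to within a few hundredths, which is the sense in which a rate code buys reliability from unreliable parts.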
So now we know that the brain is truly noisy, but we still don't know why. The UCL researchers suggest that one possibility is that it's the price the brain pays for high connectivity among neurons (each neuron connects to about 10,000 others, resulting in over 8 million kilometres of wiring in the human brain). Presumably, that high connectivity is at least in part responsible for the brain's computational power. However, as the research shows, the higher the connectivity, the noisier the brain. Therefore, while noise may not be a useful feature, it is at least a by-product of a useful feature.