
Thursday, May 28, 2015

7 Handy Uses for Aluminum Foil

Aluminum is one of the most abundant metals on Earth. It has unique properties, including resistance to rust. Roughly 75% of all the aluminum ever mined is still in use today thanks to recycling.
Aluminum foil was invented in 1903, when its production also began in the United States. Today this indispensable material has found its way into every home: it is used to wrap food, to cook, and even to boost the signal of television antennas.
Here are 7 more useful tips for using foil around the house.

1. Pizza for tomorrow
If you have leftover slices of pizza, wrap them in foil and put them in the freezer. Next time you want a quick bite, just place the pizza, still in its foil, into a preheated oven for 10–15 minutes.

2. A tough scrubber for dishes
If you don't have a scouring pad for stuck-on food, you can crumple a sheet of foil into a ball. It will quickly scrub away all the grime.

3. A stylish photo frame
You will need foil, a frame, suitable glue, and metal paint. Apply the glue to the frame and press the foil onto it. Trim the glued foil along the frame's edges, then paint it the color you want. Done!

4. A scissors sharpener
If your scissors have gone dull, you can easily sharpen them with foil. Simply make several cuts through a folded sheet of foil, then test the scissors on another material.

5. Faster ironing
Lay a sheet of foil under the cover across the entire surface of your ironing board. When ironing, especially sheets or tablecloths, you will smooth out wrinkled spots faster, because the foil reflects the iron's heat back up.


6. A more effective radiator
There is a simple way to get more heat out of an old cast-iron radiator: place a heat reflector behind it. To make one, tape heavy-duty aluminum foil to a sheet of cardboard, shiny side out. Heat from the radiator is then reflected by the foil into the room instead of being absorbed by the wall behind it. If the radiator has a cover on top, you can attach a sheet of foil underneath the cover as well.


7. A makeshift funnel
Can't find a funnel? Fold a long sheet of heavy-duty foil in half and roll it into a cone. A homemade funnel has one advantage: you can bend it to reach awkward openings.

Bonus: securing loose batteries
Does your flashlight, cordless phone, or child's toy keep switching off? Check how the batteries sit. The springs that hold them in place weaken over time, and the contact fails. Fold a small piece of foil and wedge it between the battery and the spring.

The Future Fabric of Data Analysis




When subatomic particles smash together at the Large Hadron Collider in Switzerland, they create showers of new particles whose signatures are recorded by four detectors. The LHC captures 5 trillion bits of data — more information than all of the world’s libraries combined — every second. After the judicious application of filtering algorithms, more than 99 percent of those data are discarded, but the four experiments still produce a whopping 25 petabytes (25×10^15 bytes) of data per year that must be stored and analyzed. That is a scale far beyond the computing resources of any single facility, so the LHC scientists rely on a vast computing grid of 160 data centers around the world, a distributed network that is capable of transferring as much as 10 gigabytes per second at peak performance.
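A back-of-the-envelope calculation (mine, not the article's) shows why a distributed grid is unavoidable at this scale: pushing a year's 25 petabytes through a single link at the grid's quoted 10-gigabyte-per-second peak would tie up that link for about a month.

```python
# Back-of-the-envelope: time to move the LHC's yearly stored output
# through one link at the grid's quoted peak transfer rate.
DATA_PER_YEAR = 25e15   # bytes (25 petabytes)
PEAK_RATE = 10e9        # bytes per second (10 gigabytes/s)

seconds = DATA_PER_YEAR / PEAK_RATE
days = seconds / 86_400          # seconds per day
print(f"{days:.0f} days")        # about 29 days
```

Spreading the load across 160 data centers turns that month-long serial transfer into many smaller, concurrent ones.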

The LHC’s approach to its big data problem reflects just how dramatically the nature of computing has changed over the last decade. Since Intel co-founder Gordon E. Moore first defined it in 1965, the so-called Moore’s law — which predicts that the number of transistors on integrated circuits will double every two years — has dominated the computer industry. While that growth rate has proved remarkably resilient, for now, at least, “Moore’s law has basically crapped out; the transistors have gotten as small as people know how to make them economically with existing technologies,” said Scott Aaronson, a theoretical computer scientist at the Massachusetts Institute of Technology.


Instead, since 2005, many of the gains in computing power have come from adding more parallelism via multiple cores, with multiple levels of memory. The preferred architecture no longer features a single central processing unit (CPU) augmented with random access memory (RAM) and a hard drive for long-term storage. Even the big, centralized parallel supercomputers that dominated the 1980s and 1990s are giving way to distributed data centers and cloud computing, often networked across many organizations and vast geographical distances.

These days, “People talk about a computing fabric,” said Stanford University electrical engineer Stephen Boyd. These changes in computer architecture translate into the need for a different computational approach when it comes to handling big data, which is not only grander in scope than the large data sets of yore but also intrinsically different from them.

The demand for ever-faster processors, while important, isn’t the primary focus anymore. “Processing speed has been completely irrelevant for five years,” Boyd said. “The challenge is not how to solve problems with a single, ultra-fast processor, but how to solve them with 100,000 slower processors.” Aaronson points out that many problems in big data can’t be adequately addressed by simply adding more parallel processing. These problems are “more sequential, where each step depends on the outcome of the preceding step,” he said. “Sometimes, you can split up the work among a bunch of processors, but other times, that’s harder to do.” And often the software isn’t written to take full advantage of the extra processors. “If you hire 20 people to do something, will it happen 20 times faster?” Aaronson said. “Usually not.”
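Aaronson's point about sequential steps is the intuition behind Amdahl's law (not named in the article, but the standard way to quantify it): if a fraction of a job is inherently serial, no number of processors can speed the whole job up by more than one over that fraction. A minimal sketch:

```python
def amdahl_speedup(serial_fraction, processors):
    """Amdahl's law: the maximum overall speedup when a fixed
    fraction of the work cannot be parallelized."""
    parallel_fraction = 1.0 - serial_fraction
    return 1.0 / (serial_fraction + parallel_fraction / processors)

# Even if only 5% of the work is sequential, the speedup
# saturates near 1/0.05 = 20x no matter how many processors you add.
for n in (20, 100, 100_000):
    print(n, round(amdahl_speedup(0.05, n), 2))
```

With a 5% serial fraction, 20 processors give only about a 10x speedup and 100,000 processors still fall just short of 20x, which is exactly the "hire 20 people, it usually doesn't happen 20 times faster" effect Aaronson describes.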

Researchers also face challenges in integrating very differently structured data sets, as well as the difficulty of moving large amounts of data efficiently through a highly distributed network.

Those issues will become more pronounced as the size and complexity of data sets continue to grow faster than computing resources, according to California Institute of Technology physicist Harvey Newman, whose team developed the LHC’s grid of data centers and trans-Atlantic network. He estimates that if current trends hold, the computational needs of big data analysis will place considerable strain on the computing fabric. “It requires us to think about a different kind of system,” he said.

Strange Object Found -- "The Long-Sought 'Missing Link' That Creates a Neutron Star or Black Hole"


[Image: the supernova remnant Cassiopeia A]
The object, called Supernova 2012ap (SN 2012ap), is what astronomers term a core-collapse supernova. This type of blast occurs when the nuclear fusion reactions at the core of a very massive star can no longer provide the energy needed to hold up the core against the weight of the outer parts of the star. The core then collapses catastrophically into a superdense neutron star or a black hole. The rest of the star's material is blasted into space in a supernova explosion.
Astronomers using the National Science Foundation's Very Large Array (VLA) have found a long-sought "missing link" between supernova explosions that generate gamma-ray bursts (GRBs) and those that don't. The scientists found that a stellar explosion seen in 2012 has many characteristics expected of one that generates a powerful burst of gamma rays, yet no such burst occurred.
"This is a striking result that provides a key insight about the mechanism underlying these explosions," said Sayan Chakraborti, of the Harvard-Smithsonian Center for Astrophysics (CfA). "This object fills in a gap between GRBs and other supernovae of this type, showing us that a wide range of activity is possible in such blasts," he added.
The most common type of such a supernova blasts the star's material outward in a nearly spherical bubble that expands rapidly, but at speeds far less than that of light. These explosions produce no burst of gamma rays.
In a small percentage of cases, the infalling material is drawn into a short-lived swirling disk surrounding the new neutron star or black hole. This accretion disk generates jets of material that move outward from the disk's poles at speeds approaching that of light. This combination of a swirling disk and its jets is called an "engine," and this type of explosion produces gamma-ray bursts.
The new research shows, however, that not all "engine-driven" supernova explosions produce gamma-ray bursts.
"This supernova had jets moving at nearly the speed of light, and those jets were quickly slowed down, just like the jets we see in gamma-ray bursts," said Alicia Soderberg, also of CfA.

[Image: SN 2012ap]
An earlier supernova seen in 2009 also had fast jets, but its jets expanded freely, without experiencing the slowdown characteristic of those that generate gamma-ray bursts. The free expansion of the 2009 object, the scientists said, is more like what is seen in supernova explosions with no engine, and probably indicates that its jet contained a large percentage of heavy particles, as opposed to the lighter particles in gamma-ray-burst jets. The heavy particles more easily make their way through the material surrounding the star.
"What we see is that there is a wide diversity in the engines in this type of supernova explosion," Chakraborti said. "Those with strong engines and lighter particles produce gamma-ray bursts, and those with weaker engines and heavier particles don't," he added.
"This object shows that the nature of the engine plays a central role in determining the characteristics of this type of supernova explosion," Soderberg said.
Chakraborti and Soderberg worked with an international team of scientists from five continents. In addition to the VLA, they also used data from the Giant Metrewave Radio Telescope (GMRT) in India and the InterPlanetary Network (IPN) of spacecraft equipped with GRB detectors. The team, led by Chakraborti, is reporting its work in a paper accepted to the Astrophysical Journal. Other articles, led by co-authors Raffaella Margutti and Dan Milisavljevic, also report on the X-ray and optical follow-up on SN 2012ap using a suite of space- and ground-based facilities.
In 2007, NASA’s Spitzer Space Telescope found the infrared signature of silica (sand) in the core-collapse supernova remnant Cassiopeia A, shown at the top of the page. The light from this exploding star first reached Earth in the 1600s. The cyan dot just off center is all that remains of the star that exploded. NASA/JPL-Caltech/O. Krause (Steward Observatory)
Researchers from Washington University in St. Louis report finding tiny grains of silica, which they believe came from a core-collapse supernova, in primitive meteorites.
The Daily Galaxy via NRAO