Thursday 21 January 2010

Snail Shell Could Make Body Armor


A deep-sea snail wears a multi-layered suit of armor, complete with iron, new research shows. Dissecting details of the shell’s structure could inspire tough new materials for use in everything from body armor to scratch-free paint.

“If you look at the individual properties of the bits and pieces that go into making this shell, they’re not very impressive,” comments Robert Ritchie of the University of California, Berkeley. “But the overall thing is.”

The snail, called the scaly-foot gastropod, was discovered nearly a decade ago living in a hydrothermal vent field in the Indian Ocean. In its daily life, the snail encounters extreme temperatures, high pressures and high acidity levels that threaten to dissolve its protective shell. Worse, it is hunted by crabs that try to crush the mollusk between strong claws.

To understand how the valiant gastropod holds up to these trials, Christine Ortiz of MIT and her colleagues used nanoscale experiments and computer simulations to dig into the shell’s structure. Many other species’ shells exhibit what Ortiz calls “mechanical property amplification,” in which the whole material is hundreds of times stronger than the sum of its parts.

The scaly-foot snail’s shell employs a structure “unlike any other known mollusk or any other known natural armor,” the researchers report January 19 in Proceedings of the National Academy of Sciences. Ortiz and her colleagues found that the shell consists of a 250-micrometer-thick inner layer of aragonite, a common shell material, sheathed in a 150-micrometer-thick layer of squishy organic materials. The organic layer is encased in a thin, stiff outer layer (about 30 micrometers thick) made of hard iron sulfide–based scales. The gastropod wears larger versions of the scales on its exposed foot.

“Most mollusks only have a relatively thin outer organic layer followed by inner calcified layers,” Ortiz says. But the snail’s organic layer is surprisingly thick, and no other gastropod has ever been shown to use iron sulfide in its shell.

Each of the shell’s layers plays a unique role in protecting the snail from crab attacks, Ortiz found. The researchers measured material properties like stiffness and fracture resistance, and fed them into a computational model of a predator penetrating the armor.

The model showed that the outer layer, the shell’s “first line of defense,” sacrificed itself by cracking slightly under pressure. But the cracks were branched and jagged, dissipating energy widely through the shell and keeping any one crack from spreading too far. The iron-based scales could shift and roughen the shell’s surface during a crab attack, which in turn would grind down the attacking claw, the researchers suggest.

The soft organic middle layer changed shape in response to pressure, keeping the brittle inner layer from feeling too much of the pinch. Organic material could also insert itself in any cracks that formed in either sandwiching layer and keep the crack from spreading. Plus, the middle layer together with the outer layer protects against acidic waters and may also help shield the snail from high temperatures.

The shell’s curvature also helped reduce stress on the calcified inner layer. The inner layer’s rigidity provided structural support, to keep the whole shell from caving in.

“It shows that by changing the geometry of the materials … you can improve their properties quite significantly,” comments Markus Buehler of MIT, who was not involved in the research.

Ortiz hopes that studying the snail’s shell could one day lead to improved materials for armor or helmets for people. Studying organisms that have been optimized for extreme environments through millions of years of evolution could offer ideas that engineers would never think of on their own, she says.

But it will probably be a while, Ritchie cautions. His lab built a ceramic material based on mother-of-pearl in 2008.

Thursday 14 January 2010

FAT32 Vs NTFS






Choosing the file system to use on a Windows XP system is seldom easy, and frequently it's not just a one-time decision. Different factors can blur the decision process, and some tradeoffs are more than likely. No matter how you choose to install Windows XP, you will have to face the FAT32 versus NTFS decision. Clean and upgrade installs both require you to address the question early in the process. Later on, if you add a drive or repartition an existing drive, the decision faces you yet again. Circumstances may dictate the choice for you, but in most cases the options have to be weighed and the tradeoffs of each analyzed. Let's look at the available choices.

File System Choices

Most articles discussing file system choices look at FAT32 and NTFS as the two available options. In reality, there are three file systems that can be selected: FAT, FAT32, and NTFS. Granted, FAT32 and NTFS are the primary choices, but on occasion you'll still find the need for a FAT volume. A FAT volume has a maximum size of 2 GB; it supports MS-DOS and is used in some dual-boot configurations, but backward compatibility is about the only reason I can think of that FAT should ever be used, other than for the occasional floppy diskette. That said, let's move on to FAT32 and NTFS.

FAT (FAT16) is limited to partitions of roughly 2 GB, which is a separate limit altogether.

FAT32 is, in theory, limited to partitions of about 2 TB, but it never really lasted long enough in the face of NTFS for that ceiling to matter to most users.

The limitation that usually catches people by surprise is the maximum size of a single file on a FAT32 drive: no single file can be larger than 4 GB. If you need to store larger files, either convert the volume to NTFS or use a Linux-based file system.
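To see the single-file limit in practice, here is a minimal Python sketch (hypothetical paths, standard library only) that refuses to copy a file that would not fit on a FAT32 volume:

```python
import os
import shutil

# FAT32 stores file sizes in a 32-bit field, so a single file is
# limited to 2**32 - 1 bytes (just under 4 GiB).
FAT32_MAX_FILE_SIZE = 2**32 - 1

def copy_to_fat32(src, dst):
    """Copy src to dst, refusing files too large for a FAT32 volume."""
    size = os.path.getsize(src)
    if size > FAT32_MAX_FILE_SIZE:
        raise ValueError(
            f"{src} is {size} bytes, which exceeds the FAT32 "
            f"single-file limit of {FAT32_MAX_FILE_SIZE} bytes."
        )
    shutil.copy2(src, dst)

# Example (hypothetical paths):
# copy_to_fat32(r"D:\video\movie.mkv", r"E:\backup\movie.mkv")
```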

Which File System to Choose?

As much as everyone would like a stock answer to the selection question, there isn't one. Different situations and needs will play a large role in deciding which file system to adopt. There isn't any argument that NTFS offers better security and reliability. Some also say that NTFS is more flexible, but that can get rather subjective depending on the situation and work habits, whereas NTFS's superiority in security and reliability is seldom challenged. Listed below are some of the most common factors to consider when deciding between FAT32 and NTFS.

* Security

FAT32 provides very little security. A user with access to a drive using FAT32 has access to the files on that drive.

NTFS allows the use of NTFS permissions. They are more difficult to implement, but folder and file access can be controlled individually, down to an extreme degree if necessary. The downside of using NTFS permissions is that the chance of error, and of breaking the system, is greatly magnified.
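For those who prefer scripting to the Properties dialog, here is a small illustrative Python sketch that calls the built-in cacls tool (shipped with Windows XP; later Windows versions use icacls instead) to grant a user read access to a folder. The folder path and account name are placeholders:

```python
import subprocess

def grant_read_access(path, user):
    """Grant read-only NTFS access to `user` on `path` using the
    built-in cacls tool (present on Windows XP)."""
    # /E edits the existing ACL instead of replacing it;
    # /G user:R grants the user Read permission.
    subprocess.run(["cacls", path, "/E", "/G", f"{user}:R"], check=True)

# Example (hypothetical folder and account):
# grant_read_access(r"C:\Reports", "ZEWEX\\guestuser")
```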

Windows XP Professional supports file encryption (the Encrypting File System), but only on NTFS volumes.

* Compatibility

NTFS volumes are not recognized by Windows 95/98/Me. This is only a concern when the system is set up for dual or multi-booting. FAT32 must be used for any drives that need to be accessed when the computer is booted from Windows 95/98 or Windows Me.

One additional note on the previous point: users on the network can access shared folders regardless of the disk format being used or the version of Windows installed on the host.

FAT and FAT32 volumes can be converted to NTFS volumes. NTFS cannot be converted to FAT32 without reformatting.
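The conversion is done with Windows' built-in convert utility (convert <drive>: /FS:NTFS). A minimal Python wrapper, with a hypothetical drive letter, might look like the sketch below; back up your data first, since the change is one-way:

```python
import subprocess

def convert_volume_to_ntfs(drive_letter):
    """Invoke Windows' built-in convert.exe to turn a FAT/FAT32
    volume into NTFS in place (the reverse requires reformatting)."""
    # convert <drive>: /FS:NTFS preserves existing files but
    # cannot be undone without reformatting the volume.
    subprocess.run(["convert", f"{drive_letter}:", "/FS:NTFS"], check=True)

# Example (hypothetical drive letter):
# convert_volume_to_ntfs("D")
```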

* Space Efficiency

NTFS supports disk quotas, allowing you to control the amount of disk usage on a per user basis.

NTFS supports file compression. FAT32 does not.

How a volume manages data is outside the scope of this article, but once you pass the 8GB partition size, NTFS handles space management much more efficiently than FAT32. Cluster sizes play an important part in how much disk space is wasted storing files. NTFS provides smaller cluster sizes and less disk space waste than FAT32.
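To make the cluster-size point concrete, here is a small illustrative Python sketch. The cluster sizes and workload are assumptions chosen for the example (4 KB is a common NTFS default; 32 KB is typical for FAT32 on volumes larger than 32 GB):

```python
import math

def wasted_bytes(file_sizes, cluster_size):
    """Estimate slack space lost to partially filled final clusters."""
    total = 0
    for size in file_sizes:
        clusters = math.ceil(size / cluster_size)
        total += clusters * cluster_size - size
    return total

# 10,000 small files of 2,000 bytes each (illustrative workload).
files = [2000] * 10_000

for label, cluster in [("NTFS, 4 KB clusters", 4 * 1024),
                       ("FAT32, 32 KB clusters", 32 * 1024)]:
    waste = wasted_bytes(files, cluster)
    print(f"{label}: ~{waste / 1024 / 1024:.1f} MB wasted")
```

With these assumed numbers, the FAT32 volume wastes roughly 290 MB on the same files for which NTFS wastes about 20 MB.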

In Windows XP, the maximum partition size that can be created using FAT32 is 32GB. This increases to 16TB (terabytes) using NTFS. There is a workaround for the 32GB limitation under FAT32, but it is a nuisance especially considering the size of drives currently being manufactured.

* Reliability

FAT32 drives are much more susceptible to disk errors.

NTFS volumes have the ability to recover from errors more readily than similar FAT32 volumes.

Log files are created under NTFS which can be used for automatic file system repairs.

NTFS supports dynamic cluster remapping for bad sectors, preventing them from being used in the future.


The Final Choice

As the prior versions of Windows continue to age and are replaced in the home and workplace there will be no need for the older file systems. Hard drives aren't going to get smaller, networks are likely to get larger and more complex, and security is evolving almost daily as more and more users become connected. For all the innovations that Windows 95 brought to the desktop, it's now a virtual dinosaur. Windows 98 is fast on the way out and that leaves NT and Windows 2000, both well suited to NTFS. To wrap up, there may be compelling reasons why your current situation requires a file system other than NTFS or a combination of different systems for compatibility, but if at all possible go with NTFS. Even if you don't utilize its full scope of features, the stability and reliability it offers make it the hands down choice.

Why Does an External Hard Drive of 1 TB Show Only 930 GB of Space?


There is a difference in how Hard Disk Capacities are stated by manufacturers compared to how operating systems calculate them.

To hard disk manufacturers, a kilobyte is 1,000 bytes, a megabyte is 1,000 kilobytes, a gigabyte is 1,000 megabytes, and a terabyte is 1,000 gigabytes = 1,000,000,000,000 bytes. However, to most operating systems a kilobyte is 1,024* bytes, a megabyte is 1,024 kilobytes (= 1,048,576 bytes), a gigabyte is 1,024 megabytes (1,073,741,824 bytes), and so on.

So when a hard disk manufacturer says a hard disk has a capacity of 1 terabyte, it means the capacity is 1,000,000,000,000 bytes, plus or minus a few percent. For demonstration purposes let's assume it means exactly 1,000,000,000,000 bytes. When the operating system calculates that capacity in gigabytes, it divides the 1,000,000,000,000 bytes by 1,073,741,824 bytes per gigabyte, which equals 931.3 gigabytes.
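A quick Python check of the arithmetic:

```python
# Decimal terabyte as sold by drive makers vs. binary gigabytes
# as reported by most operating systems.
advertised_bytes = 1_000_000_000_000          # 1 TB, decimal
binary_gigabyte  = 1024 ** 3                  # 1,073,741,824 bytes

print(advertised_bytes / binary_gigabyte)     # ~931.3
```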

So the reported 930 GB is about right. There is no space missing.

This difference in how hard disk capacities are stated has existed since the days when a large hard disk was only 5 MB.


* 1024 = 2^10 and is the power of 2 that is closest to 1000.

Ferropaper

The material is made by impregnating ordinary paper -- even newsprint -- with a mixture of mineral oil and "magnetic nanoparticles" of iron oxide. The nanoparticle-laden paper can then be moved using a magnetic field.


"Paper is a porous matrix, so you can load a lot of this material into it," said Babak Ziaie, a professor of electrical and computer engineering and biomedical engineering.

The new technique represents a low-cost way to make small stereo speakers, miniature robots or motors for a variety of potential applications, including tweezers to manipulate cells and flexible fingers for minimally invasive surgery.

"Because paper is very soft it won't damage cells or tissue," Ziaie said. "It is very inexpensive to make. You put a droplet on a piece of paper, and that is your actuator, or motor."

Once saturated with this "ferrofluid" mixture, the paper is coated with a biocompatible plastic film, which makes it water resistant, prevents the fluid from evaporating and improves mechanical properties such as strength, stiffness and elasticity.

Findings will be detailed in a research paper being presented during the 23rd IEEE International Conference on Micro Electro Mechanical Systems on Jan. 24-28 in Hong Kong. The paper was written by Ziaie, electrical engineering doctoral student Pinghung Wei and physics doctoral student Zhenwen Ding.

Because the technique is inexpensive and doesn't require specialized laboratory facilities, it could be used in community colleges and high schools to teach about micro robots and other engineering and scientific principles, Ziaie said.

The magnetic particles, which are commercially available, have a diameter of about 10 nanometers, or billionths of a meter, which is roughly 1/10,000th the width of a human hair. Ferro is short for ferrous, or related to iron.

"You wouldn't have to use nanoparticles, but they are easier and cheaper to manufacture than larger-size particles," Ziaie said. "They are commercially available at very low cost."

The researchers used an instrument called a field-emission scanning electron microscope to study how well the nanoparticle mixture impregnates certain types of paper.

"All types of paper can be used, but newspaper and soft tissue paper are especially suitable because they have good porosity," Ziaie said.

The researchers fashioned the material into a small cantilever, a structure resembling a diving board that can be moved or caused to vibrate by applying a magnetic field.

"Cantilever actuators are very common, but usually they are made from silicon, which is expensive and requires special cleanroom facilities to manufacture," Ziaie said. "So using the ferropaper could be a very inexpensive, simple alternative. This is like 100 times cheaper than the silicon devices now available."

The researchers have also experimented with other shapes and structures resembling origami to study more complicated movements.

The research is based at the Birck Nanotechnology Center in Purdue's Discovery Park.

Sunday 10 January 2010

The History of ATM Machines

The history of the ATM dates back to New York City in 1939 when inventor Luther George Simjian got a bank to publicly try the machine. The effort failed due to lack of customer interest at that time. In 1960 a bank in New York City had a cash machine predecessor (the Bankograph) installed that would free up tellers by accepting utility bill payments.

The next development in automated cash dispensing came in 1964, when an electronic ATM was installed in North London (UK), though it worked very differently from modern equipment. This machine would dispense cash in ten-pound-sterling amounts in exchange for a voucher purchased from a teller.
The current machine style was a creation of British engineer James Goodfellow. In 1965 he patented the cash machines that were the forerunners of what we use today. There was one type of ATM introduced in 1968 that always ate the prepaid plastic card and users would then have to buy a replacement from a teller.

In 1969, Donald C. Wetzel developed for Docutel the first machines to use cards with magnetic stripes. Since Docutel was the first company to get a patent for this type of machine, the Smithsonian credits it as the originator. The public still had trouble accepting and trusting money machines, and the machines proved very costly to operate.

Docutel led the way to the modern ATM machine in 1971 when they produced a full-function ATM called Total Teller. By 1973, these machines were capable of issuing cash in variable amounts. By 1974 the online networking component was added which led to ATMs as we know them now.

Today automated cash machines are more common than drinking fountains and are easy to locate. They are found worldwide, even in places as remote as Antarctica. The ability to use a small plastic debit or credit card to withdraw cash as needed from these machines is remarkably convenient. Some banks use ATM stations for speed and convenience in place of regular human tellers. The popularity is not surprising given the convenience: consumers no longer need to carry cash, a stack of credit cards, checks or other financial instruments. ATMs can also be adapted for selling tickets, such as concert tickets, and gift certificates.

What did businesses and the public do before the invention of the ATM? Prior to the availability of ATMs it was necessary to make a trip to the bank during regular business hours for a cash withdrawal. Another alternative was to carry personal checks around and write a check for more than the purchase amount at the grocery store or other locations that allowed that type of transaction, taking the difference in cash.

There is a growing business opportunity for those willing to bring an ATM into their store or other location. Modern automated teller machines can access many different interbank networks. Most banks and retail outlets make money by charging a usage fee when the ATM is used.

Saturday 9 January 2010

Computer Tips

Know who is using your PC During Your Absence

What to do if Task Manager is disabled by Administrator

Start a movie using Paint !!

Speed Up your PC !!

Hide your Drives (C: D: ...) !!

Thursday 7 January 2010

Comparing The Brain To The Rest Of The Universe ...


“It is widely believed that physics provides a complete catalogue of the universe's fundamental features and laws. As physicist Steven Weinberg puts it in his 1992 book Dreams of a Final Theory, the goal of physics is a "theory of everything" from which all there is to know about the universe can be derived. But Weinberg concedes there is a problem with consciousness.

Despite the power of physical theory, the existence of consciousness does not seem derivable from physical laws. He defends physics by arguing that it might eventually explain what he calls the objective correlates of consciousness (that is, the neural correlates), but of course to do this is not to explain consciousness itself. If the existence of consciousness cannot be derived from physical laws, a theory of physics is not a true theory of everything. So a final theory must contain an additional fundamental component.”


-Chalmers, David J: The Puzzle Of Conscious Experience http://consc.net/papers/puzzle.html

“It is widely accepted that conscious experience has a physical basis. That is, the properties of experience (phenomenal properties, or qualia) systematically depend on physical properties according to some lawful relation. There are two key questions about this relation. The first concerns the strength of the laws: are they logically or metaphysically necessary, so that consciousness is nothing "over and above" the underlying physical process, or are they merely contingent laws like the law of gravity?

This question about the strength of the psychophysical link is the basis for debates over physicalism and property dualism. The second question concerns the shape of the laws: precisely how do phenomenal properties depend on physical properties? What sort of physical properties enter into the laws' antecedents, for instance; consequently, what sort of physical systems can give rise to conscious experience?”


-Chalmers, David J: Absent Qualia, Fading Qualia, Dancing Qualia,
http://consc.net/papers/qualia.html


Are We Psychophysical?

Psychophysicalism, a view of the mind/body relation, holds that consciousness requires brains in order to exist. If consciousness cannot exist unless it is generated by a brain, it follows that no instance of consciousness can exist without a neural circuit corresponding to and giving rise to that experience. What do you think? Do you believe that if there are no brains (or no functioning brains) there is (or there is no longer) consciousness or conscious experience?

If Psychophysicalism is true, consciousness does not exist where there are no functioning brains. Thus it is within neurons, out of every other object in the universe, that consciousness arises. If consciousness arises “outside” neurons rather than within them, why say the brain has anything to do with consciousness?

The purpose of this paper is to compare the brain, substantially and compositionally, with every other object in the universe (in accord with secular mythology concerning the substantial and compositional nature of the brain and every other object in the universe), to discern whether or not the brain possesses a quality or property missing from the remainder of the universe. This quality or property should exist even if consciousness does not (as in cases of permanent vegetative state or coma); its absence makes the purported ability of the brain to create or generate subjective experience absurd.

The Secular Origin Of The World

In Psychophysicalism, consciousness cannot exist unless and until there are brains, and brains seem to exist upon a particular planet. The brain is composed of specialized cells, and bio-evolutionary theory holds that multicellular organisms descend from a single self-replicating cell. Life’s pathway from single cell to the human brain (as the most significant example of brain and consciousness) is entailed to occur upon a particular planet, itself formed from the fallout of physical events occurring 10-15 billion years ago:

Without the laws of physics as we know them, life on earth as we know it would not have evolved in the short span of six billion years. The nuclear force was needed to bind protons and neutrons in the nuclei of atoms; electromagnetism was needed to keep atoms and molecules together; and gravity was needed to keep the resulting ingredients for life stuck to the surface of the earth.

These forces must have been in operation within seconds of the start of the big bang, 10-15 billion years ago, to allow for the formation of protons and neutrons out of quarks and their storage in stable hydrogen and deuterium atoms.

Free neutrons disintegrate in minutes. To be able to hang around for billions of years so that they could later join with protons in making chemical elements in stars, neutrons had to be bound in deuterons and other light nuclei where energetics prevented their decay.

Gravity was needed to gather atoms together into stars and to compress stellar cores, raising the core temperatures to tens of millions of degrees. These high temperatures made nuclear reactions possible, and over billions of years the elements of the chemical periodic table were synthesized as the by-product.

When the nuclear fuel in the more massive, faster-burning stars was spent, the laws of physics called for them to explode as supernovae, sending into space the elements manufactured in their cores.

In space, gravity could gather these elements into planets circling the smaller, longer-lived stars. Finally, after about ten billion years, the carbon, oxygen, nitrogen and other elements on a small planet attached to a small, stable star could begin the process of evolution toward the complex structures we call life.

The secular myth also holds that everything in daily experience, extrapolated to the formation of the planet itself, is composed only of up quarks, down quarks, and electrons.

The Particles

Since the dawn of science, people have been wondering what the universe is made up of; what the most fundamental objects are in the universe. Well, the answer to this has changed over the years, and what you see here may or may not be the final answer, but it's the best answer we have right now.

The most fundamental particles we know about can be divided into three categories: quarks, leptons, and gauge bosons. Within the spectrum of these particles, there are several patterns that emerge. Historically, when patterns emerge in a collection of particles, this is an indication of the substructure of the particles. That is, it's an indication of what the particles that make up the particles are like. For example, some years ago, Mendeleyev arranged all the known elements according to their chemical properties into an array which we call the periodic table. The patterns he discovered are indicative of the fact that these elements are made up of protons, neutrons, and electrons. As another example, earlier this century physicists noticed patterns within a group of particles called "hadrons." Two physicists, Gell-Mann and Zweig, discovered that these patterns could be explained if hadrons are composed of more fundamental particles, now called quarks.

And so the patterns observed in the spectrum of quarks, leptons, and gauge bosons may be indications that these are made up of other particles. However, so far there's no evidence that this is the case, and no indication of what these smaller particles might be like. So these particles, as far as we now know, are made up of nothing smaller.

The Moral Of The Story:

There is nothing special about neurons in terms of their atomic or subatomic structure, given that every other object (at least on Earth, if not the universe, setting aside dark matter and dark energy) is composed of the same types of particles (with every particle identical in nature to every other particle of its type in the universe).

If the only stable particles in the universe (that we know of so far) are up quarks, down quarks, and electrons, there is no structural difference, in terms of the particles going into the makeup of the objects of everyday experience, between brains, trashcan lids, DVD movie covers, tampons, and everything else. The upshot of this is that there is nothing special about the brain in comparison to everything else in the universe in terms of its physical structure. If neurons somehow create subjective experience, it is far from obvious how subjective experience arises (or why it must arise) by reason of up quarks, down quarks, and electrons Lego-blocking into brains as opposed to their Lego-blocking into anything else.

Appearances Are Not Deceiving: The Apparent Difference Between Neurons...And The Remainder Of The Universe

The point to establish is that, while it is believed that neurons give rise to or create subjective experience, neurons themselves (neurons qua neurons) are not subjective experiences. It is believed that neurons give rise to subjective experiences, but in appearance (visually), neurons certainly cannot be mistaken for the experiences to which they give rise.

Example: an orange appears like other oranges in the vicinity, but an orange cannot be mistaken for an apple (unless we shift semantics overnight and start calling apples 'oranges').


To put the issue differently, even once it is accepted that experience arises from physical systems, the question remains open: in virtue of what sort of physical properties does conscious experience arise? Some property that brains can possess will presumably be among them, but it is far from clear just what the relevant properties are. Some have suggested biochemical properties; some have suggested quantum-mechanical properties; many have professed uncertainty. A natural suggestion is that when experience arises from a physical system, it does so in virtue of the system's functional organization. On this view, the chemical and indeed the quantum substrates of the brain are not directly relevant to the existence of consciousness, although they may be indirectly relevant.

(Chalmers, David J: Absent Qualia, Fading Qualia, Dancing Qualia,
http://consc.net/papers/qualia.html)


But set aside a heretofore unknown or magical feature of the brain found nowhere else in the universe: according to what we know of fundamental particles in terms of mass and stability (i.e., which particles have not been observed to decay into smaller particles), brains are made up only of up quarks, down quarks, and electrons. Function, however, means movement, or, in the case of stable or robust machines, repetitive or contingent movement of components in circumscribed or predictable ways.

In electronics, this “repetitive or contingent movement” ultimately involves the staging of electrically conductive materials and objects relative to each other in abstract patterns in order to circumscribe and route (“control through a maze”) a particular or varying amount of electrical energy in such a way as to force it to perform a particular task or produce a particular effect (output).


The brain is no different. Action potentials circumscribe and route electrical energy (typically) from dendrites, through the soma or body of the nerve cell, to the axon in order to release neurotransmitters that activate the next neuron in line and so on (through the membrane surrounding the neuron). It is the passage of electrical energy through more than one neuron that counts as “brain function” or “nervous system function” (in non-brain neurons). As a general rule, two or more neurons establish a nervous system (at least insofar as humans and most other animals are concerned).

But if brains are nothing but Lego-creations of up quarks, down quarks, and electrons, and given that up quarks and down quarks form protons and neutrons, and protons and neutrons do nothing else but form the nuclei of atoms (thus protons and neutrons are, in general, good only for holding an atom together as a discrete unit), the “function of the brain” is ultimately mediated by electrons.

Electric current

Electric current is the flow of electrons through a wire or solution. In a solid, the electrons are passed from one positively charged metallic atom to the next, but in solution the charge is carried by the ions present in the solution. A solution capable of carrying charge is called an electrolyte. Electrolyte solutions are found in batteries as well as in all living things.

• Current is measured according to how many electrons (units of charge) pass a given point each second.
• The symbol for current is I.
• The unit of measurement is the ampere (A), or amp (1 coulomb/second, or about 6.24 x 10^18 electrons per second); a short worked example follows below.
• The net charge on the wire carrying the current is zero.
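As a quick worked example (using the standard value of the elementary charge, about 1.602 x 10^-19 coulombs), the number of electrons passing a point each second for a given current can be computed directly:

```python
ELEMENTARY_CHARGE = 1.602e-19   # coulombs per electron

def electrons_per_second(current_amps):
    """Number of electrons passing a point each second at a given current."""
    return current_amps / ELEMENTARY_CHARGE

print(f"{electrons_per_second(1.0):.3e}")   # ~6.242e+18 for a 1 A current
```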


Conclusion

“There's something happenin' here
What it is ain't exactly clear..”


-Buffalo Springfield, For What It's Worth


In the end, let’s face it: neurons are neurons, and non-neurons are non-neurons. There are neurons, and there are volcanoes, there are neurons, and there is the subjective, inward feeling of happiness. Neurons to the bone are just cells that conduct electricity. As Chalmers states:

“…everything in physical theory is compatible with the absence of consciousness.”

If this is true, then it is coherently imaginable that cortical neurons can in principle conduct electricity with nothing occurring save motor responses in the body (if non-mental bodies and brains exist). To say this is impossible is to propose a necessary connection between consciousness and the physical that can be argued to be nothing more than a psychophysical connection of imaginary force; an imagined inextricableness of consciousness from neurons not mirrored by objective reality.

If consciousness has nothing at all to do with neurons, asks the psychophysicalist and non-mentalist, then why are there brains at all? Why are there reports of correspondence between neural manipulation and experience?

A paper published recently in the journal Nature (vol.391, page 650, 1998) called "Electric Current Stimulates Laughter" has provided a bit more information about how the brain is involved with laughter. The paper discussed the case of a 16 yr. old girl named "A.K." who was having surgery to control seizures due to epilepsy. During surgery, the doctors electrically stimulated A.K.'s cerebral cortex to map her brain. Mapping of the brain is done to determine the function of different brain areas and to make sure that brain tissue that will be removed does not have an important function.

The doctors found that A.K. always laughed when they stimulated a small 2 cm by 2 cm area on her left superior frontal gyrus (part of the frontal lobe of the brain). This brain area is part of the supplementary motor area. Unlike laughter that happens after brain damage, the laughter that was produced by electrical stimulation in A.K. also had a sense of "merriment or mirth". Also, A.K. did NOT have the type of epilepsy with gelastic seizures. Each time her brain was stimulated, A.K. laughed and said that something was funny. The thing that she said caused her to laugh was different each time. A.K. laughed first, then made up a story that was funny to her. Most people first know what is funny, then they laugh.


(Neuroscience for Kids: Laughter And The Brain, http://fc.units.it/ppb/neurobiol/Neuroscienze%20per%20tutti/laugh.html)

What’s happening here? The brain, manipulation of the brain, and any reports of experiences arising in response to neural manipulation are all subjectively perceived by something subjectively perceiving. The subjective perception of this perceiver is the only thing known with certainty to actually exist. Thus the central question of the nature of existence is whether or not the nonexistence of the subjective perceiver, and all subjective perceivers, leaves behind the objects perceived by the former perceiver in non-mental form.

Secular mythology of the nature of death sets the standard. If consciousness becomes as real as Santa Claus or the Easter Bunny once electrical activity in the cerebral cortex ceases, the mythology must hold that there is a phenomenal (subjectively experienced) brain and subjectively experienced subject, and a non-mental counterpart to the brain and the subject that is a wholly distinct existence from their subjective twins, which are aspects of a particular perceiver.

We know the perceiver exists. Delusions aside, reality manifests at least as a “moving camera point-of-view” that is aware and perceives. We do not know if non-mental analogs of that which the “camera” observes exist, and it is beyond logic that non-mentality must exist in order for there to be mentality. As two distinct existences (with one easily capable of existing without the other), it is inconceivable why one should depend or need the other to exist.

In the end, there is no logic behind the emergence of subjective experience from neurons. One would have to postulate interdimensional portals or find refuge in creation ex nihilo. Even if one accepts or takes into account panpsychism or panprotopsychism, one must hold to a magical “non-touching” connection between the mental and non-mental. Despite secular complaint of the use of magical thinking in religious or spiritual explanation about the world and how it works, this magical thinking exists in the notion that neurons, cells through and through, somehow “vomit” subjective experience—something that is definitely not a biological cell. This magical thinking wears the sheep’s clothing of “reason over faith”, but upon close examination one finds it is imagination that borrows the actual content of visual perception to create a false "actuality" for itself, backed by willful ignorance and denial of the obvious duality and incompatibility between electrified bio-cell and subjective experience.

Tuesday 5 January 2010

Welcome to our Blog !!

Hello world! We’re excited to launch our first official Blog for the ZeWeX Development Center. We are developers, testers, user assistance managers, computer engineers, product marketers and executives who offer different perspectives but share the same passion for Internet and Technology. We hope this Blog provides you with a deeper insight into our work and who we are, and we’re very excited to be out here talking to you !!

Keep on the lookout for upcoming posts from our team members this week and beyond.

Thanks so much for visiting our blog. Enjoy !!