scope labs
Glossary

 

Empirical


Main Entry: em·pir·i·cal Pronunciation: \-i-kəl\ Variant(s): also em·pir·ic \-ik\ Function: adjective Date: 1569

1 : originating in or based on observation or experience <empirical data>
2 : relying on experience or observation alone often without due regard for system and theory <an empirical basis for the theory>
3 : capable of being verified or disproved by observation or experiment <empirical laws>

 

 

 

Premise

In discourse and logic, a premise is a claim that is a reason (or element of a set of reasons) for, or objection against, some other claim. In other words, it is a statement presumed true within the context of an argument toward a conclusion. Premises are sometimes stated explicitly by way of disambiguation or for emphasis, but more often they are left tacitly understood as being obvious or self-evident ("it goes without saying"), or not conducive to succinct discourse. For example, in the argument

Socrates is mortal, since all men are

it is evident that a tacitly understood claim is that Socrates is a man. The fully expressed reasoning is thus:

Since all men are mortal and Socrates is a man, it follows that Socrates is mortal.

In this example, the first two independent clauses preceding the comma (namely, "all men are mortal" and "Socrates is a man") are the premises, while "Socrates is mortal" is the conclusion.

In the context of ordinary argumentation, the rational acceptability of a disputed conclusion depends on both the truth of the premises and the soundness of the reasoning from the premises to the conclusion.
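The structure of the Socrates syllogism can be illustrated (though not proved in general) with a toy model in Python. Everything below, the predicate sets and the names in them, is invented for the example:

```python
# Toy illustration of the syllogism above, checked over a small invented
# domain. The sets here are made up for the example.

is_man = {"Socrates", "Plato"}
is_mortal = {"Socrates", "Plato", "Fido"}   # Fido is mortal but not a man

# Premise 1 (the general claim): all men are mortal.
all_men_are_mortal = is_man <= is_mortal    # subset test

# Premise 2 (the tacitly understood claim): Socrates is a man.
socrates_is_a_man = "Socrates" in is_man

# Conclusion: Socrates is mortal. With both premises true in this model,
# the conclusion must also hold.
socrates_is_mortal = "Socrates" in is_mortal
```

The subset test mirrors the universal premise: "all men are mortal" holds in a model exactly when the set of men is contained in the set of mortals.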

 

-------------------------------------------------------------------------------------

 

Pascal's Triangle

We can use Pascal's Triangle to find probabilities for repeated trials that each have exactly two equally likely outcomes (heads/tails, boy/girl). Row n of the triangle lists, reading from one end to the other, the number of ways to get n of one outcome, then n − 1, and so on down to zero. Each row corresponds to one stage of the experiment, and the sum of the numbers across row n is 2^n, the size of the sample space for that experiment.

EXAMPLE

What is the probability, in a family with five children, that all five are girls? Look at row five of the triangle (1, 5, 10, 10, 5, 1) and label the entries from the left as 5 girls, 4 girls, 3 girls, and so on down to zero girls. Take the first number in the row and divide it by the total number of outcomes, found by adding the numbers across the entire row. Thus P(5 girls) = 1/32.

EXAMPLE

What is the probability of tossing four coins and getting each of the following results? Look at row four of the triangle (1, 4, 6, 4, 1) and label the entries no heads, one head, two heads, and so on across the row.

P(3 heads) = 4/16
P(2 heads and 2 tails) = 6/16
P(no heads) = 1/16
P(more than 2 heads) = P(3 or 4 heads) = 4/16 + 1/16 = 5/16
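The row-and-label bookkeeping above can be sketched in Python using the standard library's `math.comb` (assuming, as the examples do, that rows are numbered from 0, so "row five" means n = 5):

```python
from math import comb

def pascal_row(n):
    """Row n of Pascal's Triangle (row 0 is the apex [1])."""
    return [comb(n, k) for k in range(n + 1)]

row5 = pascal_row(5)            # [1, 5, 10, 10, 5, 1]
total5 = sum(row5)              # 2**5 = 32 equally likely outcomes

# P(5 girls): the first entry of row 5 divided by the row total.
p_five_girls = row5[0] / total5          # 1/32

row4 = pascal_row(4)            # [1, 4, 6, 4, 1], labelled 0..4 heads
total4 = sum(row4)              # 2**4 = 16

p_three_heads = row4[3] / total4                  # 4/16
p_two_heads_two_tails = row4[2] / total4          # 6/16
p_more_than_two = (row4[3] + row4[4]) / total4    # 5/16
```

Each entry of row n is the binomial coefficient C(n, k), which is why dividing by the row sum 2^n gives the probability of exactly k successes.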

 

 

--------------------------------------------------------------------------------------------------------

 

Deduction & Induction

Deductive and Inductive Thinking

In logic, we often refer to the two broad methods of reasoning as the deductive and inductive approaches.

Deductive reasoning works from the more general to the more specific. Sometimes this is informally called a "top-down" approach. We might begin with thinking up a theory about our topic of interest. We then narrow that down into more specific hypotheses that we can test. We narrow down even further when we collect observations to address the hypotheses. This ultimately leads us to be able to test the hypotheses with specific data -- a confirmation (or not) of our original theories.

Inductive reasoning works the other way, moving from specific observations to broader generalizations and theories. Informally, we sometimes call this a "bottom up" approach (please note that it's "bottom up" and not "bottoms up" which is the kind of thing the bartender says to customers when he's trying to close for the night!). In inductive reasoning, we begin with specific observations and measures, begin to detect patterns and regularities, formulate some tentative hypotheses that we can explore, and finally end up developing some general conclusions or theories.

These two methods of reasoning have a very different "feel" to them when you're conducting research. Inductive reasoning, by its very nature, is more open-ended and exploratory, especially at the beginning. Deductive reasoning is more narrow in nature and is concerned with testing or confirming hypotheses. Even though a particular study may look like it's purely deductive (e.g., an experiment designed to test the hypothesized effects of some treatment on some outcome), most social research involves both inductive and deductive reasoning processes at some time in the project. In fact, it doesn't take a rocket scientist to see that we could assemble these two approaches into a single circular process that continually cycles from theories down to observations and back up again to theories. Even in the most constrained experiment, the researchers may observe patterns in the data that lead them to develop new theories.

 

 

Entropy

Ice melting - a classic example of entropy increasing,[1] described in 1862 by Rudolf Clausius as an increase in the disgregation of the molecules of the body of ice.[2]


In thermodynamics (a branch of physics), entropy, symbolized by S,[3] is a measure of the unavailability of a system’s energy to do work.[4][5]

It is a measure of the randomness of molecules in a system and is central to the second law of thermodynamics and the fundamental thermodynamic relation, which deal with physical processes and whether they occur spontaneously. Spontaneous changes, in isolated systems, occur with an increase in entropy. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed.
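The ice-melting example above can be put in numbers with the Clausius definition ΔS = Q_rev / T applied at constant temperature. This is a minimal sketch; the latent heat of fusion of ice (about 334 J/g) and the melting point (273.15 K) are standard approximate values:

```python
# Entropy change of ice melting reversibly at 0 degrees C,
# using the Clausius definition dS = dQ_rev / T at constant T.

LATENT_HEAT_FUSION = 334.0   # J per gram of ice (approximate)
T_MELT = 273.15              # melting point of ice, kelvin

def entropy_of_melting(mass_g):
    """Entropy gained (J/K) by mass_g grams of ice melting at 0 deg C."""
    q_rev = mass_g * LATENT_HEAT_FUSION   # heat absorbed, joules
    return q_rev / T_MELT                 # delta S = Q_rev / T

delta_s = entropy_of_melting(1000.0)      # 1 kg of ice: about 1223 J/K
```

The sign is positive, as the second law requires for this spontaneous change in an isolated ice-plus-surroundings system: heat flows in at fixed temperature, so the molecules' energy spreads over more configurations.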

The word "entropy" is derived from the Greek εντροπία, "a turning toward" (εν- "in" + τροπή "a turning").

 

Causality


The Illustrated Sutra of Cause and Effect. 8th century, Japan

Causality denotes a necessary relationship between one event (called cause) and another event (called effect) which is the direct consequence (result) of the first.[1]

While this informal understanding will suffice in everyday use, the philosophical analysis of causality has proven difficult. The work of philosophers to understand causality and how best to characterize it extends over millennia. In the western philosophical tradition explicit discussion stretches back at least as far as Aristotle, and the topic remains a staple in contemporary philosophy journals. Though cause and effect are typically related to events, other candidates include processes, properties, variables, facts, and states of affairs; which of these comprise the correct causal relata, and how best to characterize the nature of the relationship between them, has as yet no universally accepted answer, and remains under discussion.

According to Sowa (2000),[2] up until the twentieth century, three assumptions described by Max Born in 1949 were dominant in the definition of causality:

  1. "Causality postulates that there are laws by which the occurrence of an entity B of a certain class depends on the occurrence of an entity A of another class, where the word entity means any physical object, phenomenon, situation, or event. A is called the cause, B the effect.
  2. "Antecedence postulates that the cause must be prior to, or at least simultaneous with, the effect.
  3. "Contiguity postulates that cause and effect must be in spatial contact or connected by a chain of intermediate things in contact." (Born, 1949, as cited in Sowa, 2000)

However, according to Sowa (2000), "relativity and quantum mechanics have forced physicists to abandon these assumptions as exact statements of what happens at the most fundamental levels, but they remain valid at the level of human experience."[2]


History

Western philosophy

Aristotle

In his Posterior Analytics and Metaphysics, Aristotle wrote, "All causes are beginnings..."[3], "... we have scientific knowledge when we know the cause..."[4], and "... to know a thing's nature is to know the reason why it is..."[5] This formulation set the guidelines for subsequent causal theories by specifying the number, nature, principles, elements, varieties, order of causes as well as the modes of causation. Aristotle's account of the causes of things is a comprehensive model.

Aristotle's theory enumerates the possible causes which fall into several wide groups, amounting to the ways the question "why" may be answered; namely, by reference to the material worked upon (as by an artisan) or what might be called the substratum; to the essence, i.e., the pattern, the form, or the structure by reference to which the "matter" or "substratum" is to be worked; to the primary moving agent of change or the agent and its action; and to the goal, the plan, the end, or the good that the figurative artisan intended to obtain. As a result, the major kinds of causes come under the following divisions:

  • The material cause is that "raw material" from which a thing is produced as from its parts, constituents, substratum, or materials. This rubric limits the explanation of cause to the parts (the factors, elements, constituents, ingredients) forming the whole (the system, structure, compound, complex, composite, or combination) (the part-whole causation).
  • The formal cause tells us what, by analogy to the plans of an artisan, a thing is intended and planned to be. Any thing is thought to be determined by its definition, form (mold), pattern, essence, whole, synthesis, or archetype. This analysis embraces the account of causes in terms of fundamental principles or general laws, as the intended whole (macrostructure) is the cause that explains the production of its parts (the whole-part causation).
  • The efficient cause is that external entity from which the change or the ending of the change first starts. It identifies 'what makes of what is made and what causes change of what is changed' and so suggests all sorts of agents, nonliving or living, acting as the sources of change or movement or rest. Representing the current understanding of causality as the relation of cause and effect, this analysis covers the modern definitions of "cause" as either the agent, agency, particular causal events, or the relevant causal states of affairs.
  • The final cause is that for the sake of which a thing exists, or is done - including both purposeful and instrumental actions. The final cause, or telos, is the purpose, or end, that something is supposed to serve; or it is that from which, and that to which, the change is. This analysis also covers modern ideas of mental causation involving such psychological causes as volition, need, motivation, or motives; rational, irrational, ethical - all that gives purpose to behavior.

Additionally, things can be causes of one another, reciprocally causing each other, as hard work causes fitness, and vice versa - although not in the same way or by means of the same function: the one is as the beginning of change, the other is as its goal. (Thus Aristotle first suggested a reciprocal or circular causality - as a relation of mutual dependence, action, or influence of cause and effect.) Also, Aristotle indicated that the same thing can be the cause of contrary effects - as its presence and absence may result in different outcomes. In speaking thus he formulated what currently is ordinarily termed a "causal factor," e.g., atmospheric pressure as it affects chemical or physical reactions.

Aristotle marked two modes of causation: proper (prior) causation and accidental (chance) causation. All causes, proper and incidental, can be spoken of as potential or as actual, particular or generic. The same language refers to the effects of causes, so that generic effects are assigned to generic causes, particular effects to particular causes, and operating causes to actual effects. It is also essential that ontological causality does not suggest the temporal relation of before and after between the cause and the effect; that spontaneity (in nature) and chance (in the sphere of moral actions) are among the causes of effects belonging to efficient causation; and that no incidental, spontaneous, or chance cause can be prior to a proper, real, or underlying cause per se.

All investigations of causality coming later in history will consist in imposing a favorite hierarchy on the order (priority) of causes, such as "final > efficient > material > formal" (Aquinas); or in restricting all causality to the material and efficient causes; or to efficient causality alone (deterministic or chance); or just to regular sequences and correlations of natural phenomena (the natural sciences describing how things happen rather than asking why they happen).

Causality, determinism, and existentialism

Causality has taken many journeys in the minds of men for over 3000 years.[6] Determinism and existentialism are but a few of the manifestations of this journey.

The deterministic world-view is one in which the universe is no more than a chain of events following one after another according to the law of cause and effect. To hold this worldview, as an incompatibilist, there is no such thing as "free will". However, compatibilists argue that determinism is compatible with, or even necessary for, free will.

Learning to bear the burden of a meaningless universe, and justify one's own existence, is the first step toward becoming the "Übermensch" (English: "overman" or "superman") that Nietzsche speaks of extensively in his philosophical writings.

Existentialists have suggested that people have the courage to accept that while no meaning has been designed in the universe, we each can provide a meaning for ourselves.

Though philosophers have pointed out the difficulties in establishing theories of the validity of causal relations, there is still a plausible example of causation afforded to us daily: our own ability to be the cause of events. This concept of causation does not prevent us from seeing ourselves as moral agents.

Paranoia

Paranoia is a disturbed thought process characterized by excessive anxiety or fear, often to the point of irrationality and delusion. Paranoid thinking typically includes persecutory beliefs concerning a perceived threat. In the original Greek, παράνοια (paranoia) simply means madness (para = outside; nous = mind). Historically, this characterization was used to describe any delusional state.

Sometimes in common usage the term paranoia is misused to describe a phobia. For example, a person may refuse to fly out of fear that the plane will crash. This is not in itself paranoia but a phobia; the absence of blame directed at anyone usually points to the latter. Fearing, with no evidence to suggest it, that the pilot is an alcoholic who will crash the plane as a result would be an example of paranoia.

Use in psychiatry

More recently,[1] clinical use of the term has narrowed to describe delusions in which the affected person believes he or she is being persecuted. Specifically, such delusions have been defined as containing two central elements:

  1. The individual thinks that harm is occurring, or is going to occur, to him or her.
  2. The individual thinks that the persecutor has the intention to cause harm.

Paranoia is often associated with psychotic illnesses, sometimes schizophrenia, although attenuated features may be present in other primarily non-psychotic diagnoses, such as paranoid personality disorder and obsessive compulsive disorder. Paranoia can also be a side effect of medication or recreational drugs such as marijuana and particularly stimulants such as methamphetamine and crack cocaine. In the unrestricted use of the term, common paranoid delusions can include the belief that the person is being followed, poisoned or loved at a distance (often by a media figure or important person, a delusion known as erotomania or de Clerambault syndrome). Other common paranoid delusions include the belief that the person has an imaginary disease or parasitic infection (delusional parasitosis); that the person is on a special


Schadenfreude: the pleasure derived from the misfortunes of others.

 
 
 