Katoya Palmer: Public relations, marketing, sales, and event management consulting.
A Seattle-based boutique firm specializing in sales, marketing, public relations, event management and production, and advertising projects. It's my business to facilitate your project at any scale with trust, professionalism, and positive energy while implementing new tools for professional success. "Think out the box" with TBC creative direction, nurture a trusting bond, and reap long-term results.
Offering consulting and project management services in the following areas: sales, marketing, public relations, image consulting, social media management, professional blogging, event planning and management, fundraising, writing (press releases, reviews, bios, etc.), business development, concert and after-party business events, and celebrity booking.
Specialties: meeting facilitation, marketing strategy, personal events, public relations, community relations, concerts, celebrity booking, fundraising, branding, and reputation management.
Celebrity booking projects: Jean Grae, Jagged Edge, Amber Rose, Rick Ross, Wale, Tasha Jones, Black Ice, Gyptian.
Direct booking responsibilities for Black Stax (Jace Ecaj, Silas Blak, Felicia Loud) and the Klyntel band.
Anyone building a business, listen up: Stanford is offering 16 free, online courses for anyone looking to pick up a few extra skills. The courses include some technology and entrepreneurship-based subjects that could help you get that edge you need.
Stanford University is the institution for entrepreneurship. Over its history, the university has produced notable alumni such as Vint Cerf, now vice president and chief Internet evangelist at Google; Google co-founders Sergey Brin and Larry Page; William Hewlett and David Packard of HP; and a number of other recognizable technology enthusiasts.
It encourages students to start businesses, and offers courses to that end. Now it’s offering 16 free courses with focuses in business, entrepreneurship, technology, and science. But you don’t have to be a full-time student at Stanford to take advantage. The university says, “the courses are open to anyone with a computer, anywhere.”
For nine of these courses, Stanford is using Coursera, a startup that partners with universities across the country to organize and launch free courses. So far, Coursera has 16 participating universities and $16 million in its first round of funding. "Writing in the Sciences" and "Human-Computer Interaction" are two of Stanford's courses being hosted by the startup.
There are a number of courses that could come in handy for someone trying to start a business in the Valley. Here’s a list of what’s coming up this fall:
Machine Learning with Professor Andrew Ng, starting August 20
Cryptography with Professor Dan Boneh, starting August 27
Introduction to Mathematical Thinking with Professor Keith Devlin, starting September 17
Probabilistic Graphical Models with Professor Daphne Koller, starting September 24
Human-Computer Interaction with Professor Scott Klemmer, starting September 24
Introduction to Logic with Professor Michael Genesereth, starting September 24
Organizational Analysis with Professor Dan McFarland, starting September 24
Writing in the Sciences with Professor Kristin Sainani, starting September 24
Algorithms: Design and Analysis, Part 2 with Professor Tim Roughgarden, starting in October
Technology Entrepreneurship with Professor Chuck Eesley, starting in the fall
A Crash Course on Creativity with Professor Tina Seelig, starting in the fall
Designing a New Learning Environment with Professor Paul Kim, starting in the fall
Finance with Professor Kay Giesecke, starting in the fall
Startup Boards: Advanced Entrepreneurship with Professor Clint Korver, starting in the fall
Solar Cells, Fuel Cells and Batteries with Professor Bruce Clemens, starting October 8
An Introduction to Computer Networks with Professors Nick McKeown and Philip Levis, starting October 8
How Alan Turing set the rules for computing
The Turing Machine gave the world a model for how computers could operate
By Joab Jackson
June 22, 2012 (IDG News Service)
On Saturday, British mathematician Alan Turing would have turned 100 years old. It is hard to fathom that none of the computing power surrounding us today existed when he was born.
But without Turing’s work, computers as we know them today simply would not exist, Robert Kahn, co-inventor of the TCP/IP protocols that run the Internet, said in an interview. Absent Turing, “the computing trajectory would have been entirely different, or at least delayed,” he said.
For while the idea of a programmable computer has been around since at least 1837 — when English mathematician Charles Babbage formulated the idea of his analytical engine — Turing was the first to do the difficult work of mapping out the physics of how the digital universe would operate. And he did it using a single (theoretical) strip of infinite tape.
“Turing is so fundamental to so much of computer science that it is hard to do anything with computers that isn’t some way influenced by his work,” said Eric Brown, who was a member of the IBM team that built the “Jeopardy”-winning Watson supercomputer.
A polymath of the highest order, Turing left a list of achievements stretching far beyond the realm of computer science. During World War II, he was instrumental in cracking German encrypted messages, allowing the British to anticipate Germany’s actions and ultimately help win the war. Using his mathematical chops, he also developed ideas in the field of non-linear biological theory, which paved the way for chaos and complexity theories. And to a lesser extent he is known for his sad demise, an apparent suicide after being persecuted by the British government for his homosexuality.
But it may be computer science where his legacy will be the most strongly felt. Last week, the Association for Computing Machinery held a two-day celebration of Turing, with the computer field's biggest luminaries, including Vint Cerf, Ken Thompson, and Alan C. Kay, paying tribute to the man and his work.
Turing was not alone in thinking about computers in the early part of the past century. Mathematicians had been thinking about computable functions for some time. Turing drew from colleagues' work at Princeton University during the 1930s. There, Alonzo Church was developing the lambda calculus (which later formed the basis of the Lisp programming language), and Kurt Gödel worked on the incompleteness theorems and recursive function theory. Turing employed the work of both mathematicians to create a conceptual computing machine.
His 1936 paper described what would later become known as the Turing Machine, or a-machine as he called it. In the paper, he described a theoretical operation that used an infinitely long piece of tape containing a series of symbols. A machine head could read the symbols on the tape as well as add its own symbols. It could move about to different parts of the tape, one symbol at a time.
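The mechanics described above — a head that reads the symbol under it, writes a symbol, moves one cell at a time, and changes state — can be sketched in a few lines of Python. This is an illustrative simulator, not Turing's own formulation; the bit-inverting rule table is a made-up example, and the dictionary-backed tape stands in for the infinite strip (unwritten cells default to a blank symbol).

```python
def run_turing_machine(transitions, tape_input, start_state, halt_state, blank="_"):
    """Simulate a simple one-tape Turing machine.

    transitions maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is "R" (right) or "L" (left).
    """
    # A dict emulates the infinite tape: cells never written read as blank.
    tape = {i: s for i, s in enumerate(tape_input)}
    head, state = 0, start_state
    while state != halt_state:
        symbol = tape.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        tape[head] = write                    # write a symbol on the tape
        head += 1 if move == "R" else -1      # move one cell at a time
    # Read the tape back in order, dropping trailing blanks.
    return "".join(tape[i] for i in sorted(tape)).rstrip(blank)

# Example rule table: invert every bit, halt at the first blank cell.
invert = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(invert, "1011", "scan", "halt"))  # -> 0100
```

Even this toy machine shows the essential ingredients Turing identified: a finite set of states, a finite alphabet of symbols, and purely local read/write/move steps.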
“The Turing machine gave some ideas about what computation was, what it would mean to have a program,” said James Hendler, a professor of computer science at the Rensselaer Polytechnic Institute and one of the instrumental researchers of the semantic Web. “Other people were thinking along similar lines, but Turing really put it in a formal perspective, where you could prove things about it.”
On its own, a Turing Machine could never be implemented. For one, “infinite tapes are hard to come by,” Kahn joked. But the concept proved invaluable for the ideas it introduced into the world. “Based on the logic of what was in the machine, Turing showed that any computable function could be calculated,” Kahn said.
Today’s computers, of course, use binary logic. A computer program can be thought of as an algorithm or set of algorithms that a compiler converts into a series of 1’s and 0’s. In essence, they operate exactly like the Turing Machine, absent the tape.
“It is generally accepted that the Turing Machine concept can be used to model anything a digital computer can do,” explained Chrisila Pettey, who heads the Department of Computer Science at Middle Tennessee State University.
Thanks to Turing, “any algorithm that manipulates a finite set of symbols is considered a computational procedure,” Pettey said in an interview via email.
Conversely, anything that cannot be modeled in a Turing Machine could not run on a computer, which is vital information for software design. “If you know that your problem is intractable, and you don’t have an exponential amount of time to wait for an answer, then you’d better focus on figuring out a way to find an acceptable alternative instead of wasting time trying to find the actual answer,” Pettey said.
“It’s not that computer scientists sit around proving things with Turing Machines, or even that we use Turing Machines to solve problems,” Pettey said. “It’s that how Turing Machines were used to classify problems has had a profound influence on how computer scientists approach problem solving.”
At the time Turing sketched out his ideas, the world had plenty of pretty sophisticated adding machines that would allow someone to perform simple calculations. What Turing offered was the idea of a general-purpose programmable machine. “You would give it a program and it would do what the program specified,” Kahn explained.
In the next decade, another polymath, John von Neumann, at the Princeton Institute for Advanced Study, started working on an operational computer that borrowed from Turing's idea, except it would use random access memory instead of infinite tape to hold the data and operational programs. Called MANIAC (Mathematical Analyzer, Numerical Integrator, and Computer), it was among the first modern computers ever built and was operational in 1952. MANIAC used what is now called the Von Neumann architecture, the model for all computers today.
Returning to Britain after his time at Princeton, Turing worked on another project to build a computer that used these concepts, called the Automatic Computing Engine (ACE), and pioneered the idea of a stored memory machine, which would become a vital part of the Von Neumann architecture.
As well as sparking the field of computer science, the impact his work had on cracking encryption may ultimately have also saved Great Britain from becoming a German colony. People have argued that Turing’s work defining computers was essential to his success in breaking the encryption generated by Germany’s Enigma machine — work that helped bring World War II to an end.
“By today’s definitions, the Enigma was an analog computer. What he [and his team] built was much closer to [the operations] of a digital computer,” Rensselaer’s Hendler explained. “Essentially he showed the power of digital computing in attacking this analog problem. This really changed the whole way that the field thought about what computers could do.”
Having defined computational operations, Turing went on to play a fundamental role in defining artificial intelligence — computer intelligence that mimics human thinking. In 1950, he authored a paper that offered a way to determine if a computer possessed human intelligence. The test involves a person having an extended conversation with two hidden entities, a computer and a man pretending to be a woman. ("In both cases he wanted pretending," Hendler explained.) If the person can't determine which party is the computer, the machine can be said to think like a human.
“He wanted to put human and computing on equal footing,” Hendler said. “Language is a critical skill for humans because it requires understanding and context. If a computer showed that level of understanding then you wouldn’t notice the difference.”
The test “has the advantage of drawing a fairly sharp line between the physical and the intellectual capacities of a man,” Turing wrote in the original paper.
As IBM’s Brown noted, Turing’s legacy is still strongly felt today. In his mathematics work, he showed that “there exist problems that no decision process could answer,” Hendler said. In terms of computers, this means, “You could never prove for all complicated computer programs that they are correct,” Hendler said. “You could never write a computer program that could debug all other computer programs.”
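The result Hendler is describing is the halting problem, and Turing's argument for it can be sketched in Python-style pseudocode. The `halts` function below is hypothetical — the whole point of the argument is that it cannot actually be implemented:

```python
# Pseudocode sketch of Turing's diagonalization argument (not runnable).

def halts(program, argument):
    """Hypothetical perfect analyzer: returns True exactly when
    program(argument) would eventually halt."""
    ...

def paradox(program):
    if halts(program, program):
        while True:       # loop forever if the analyzer predicts halting
            pass
    # otherwise, halt immediately

# Now ask: does paradox(paradox) halt?
# If halts says yes, paradox loops forever; if it says no, paradox halts.
# Either answer contradicts the analyzer, so no such halts() can exist.
```

This is why, as Hendler says, no program can debug (or fully verify) all other programs: such a debugger would have to decide halting, among other undecidable properties.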
But far from restricting progress of computer science, the knowledge of such inconclusiveness paved the way for building previously unimagined technologies. It allowed engineers to create immensely helpful services such as Internet search engines, despite knowing that the answers such services were to provide would not always be complete.
“You have people who say we should never build a computing system unless we can prove it is secure. Those of us who understand Turing say, ‘Well, you can’t.’ So you must start proving some approximation of secure, which starts a very different conversation,” Hendler said.
And despite numerous attempts to beat the Turing Test, it still hasn’t been done, except within the most limited of topics. That means we will likely be working to meet Turing’s benchmarks for years to come.
“You can’t say, ‘Siri. How are you today?’ and expect it to go on from there in any interesting way,” Hendler said.
National hackathon competitions to unite the startup world and bring investment to great ideas. The next hack is June 23-24 in SF, NYC, Seattle, and Boston.