Pages


Saturday, 12 April 2014

Hurt Locker P2P Lawsuits Move On To Canada

By now, US torrent users are used to the nagging worry that a copyright holder could seek damages against them. Now these mass lawsuits appear to be making the journey to Canada, where Voltage Pictures is seeking the identities of users it claims have pirated the film Hurt Locker. Major ISPs have been subpoenaed, but the number of defendants is not yet available.

Back in early 2010, Voltage Pictures, the makers of the Oscar-winning film Hurt Locker, began suing to uncover the people behind thousands of IP addresses. Voltage Pictures contends that these IPs were spotted downloading the film via torrents. In most cases, judges granted the subpoenas and ISPs had to divulge customer details. Instead of suing outright, Voltage had its legal counsel extract settlements from the defendants.

Now the same scheme is playing out up north. A court in Montreal gave the thumbs up to Voltage and its legal counsel two weeks ago. Today is the deadline for three Canadian ISPs to hand over the records. It’s unclear if the same settlement shakedown is going to happen in Canada, but we have to imagine it will.

Tuesday, 01 April 2014

Adobe’s controversial decision to dump its Flash plugin for mobiles

Web developers are angry with Adobe.

Emotions are running high in the world of professional web design and development, with words such as “shambles” and “betrayal” in common use. This anger is directed at Adobe’s controversial decision to dump its Flash plugin for mobiles. Fellow RWC columnist Kevin Partner summed up these feelings in a recent blog post:

“The irony is that it isn’t Steve Jobs’ famous hatred of Flash that has caused this turnaround – the true villain of the piece is Adobe itself. By abandoning development of Flash for mobile, it eliminates Flash as an option for most websites... Farewell Adobe. Delete.”

After years as a Flash-based developer, Kevin will save himself a lot of money by ditching it for the open standards HTML, CSS and JavaScript. This is serious stuff: designers and developers who built careers around Flash have had their skills rendered worthless, and livelihoods are on the line. Unlike Adobe, these folk still believe in Flash.

Certainly, Flash can be abused in irritating banner ads; it annoyingly asks to be updated twice a week; and (like JavaScript) can crash the machine if poorly coded. However, the widespread prejudice that it’s an unnecessary hindrance to the smooth running of a browser-based web is wrong. Flash has enriched the web enormously with vector graphics, bitmaps, audio, video, interactivity, communications, programmability and now even 3D – today’s web would be unrecognisably poorer without it, and so will that of tomorrow.

The fact that Flash delivers all of this functionality via a browser plugin is actually its greatest strength. Adobe’s Allan Padgett discovered how to make Acrobat Reader render PDFs directly inside Netscape, showed this to Jim Clark and the Netscape Plugin Application Programming Interface (NPAPI) was born, enabling browsers to reserve onscreen space for content rendered by any compliant plugin.

Soon all browsers supported the cross-platform NPAPI, and player-based delivery became the norm: add a new function to the plugin and it's immediately available to all web users regardless of CPU, OS and browser (and even backwards-compatible with the oldest NPAPI-compliant browsers). Contrast this with the glacial pace of HTML/CSS/JavaScript development, where designers can deploy any new capability only after the slowest browsers and users have finally caught up.

Ironically, the biggest beneficiary of this plugin revolution wasn't Adobe, with its ability to render PDFs in the browser, but Macromedia, which could render Shockwave Flash (SWF). The reason isn't widely understood: NPAPI doesn't just support rendering chunks of static content; it can also stream content through a persistent connection.

Macromedia broke free from HTML’s static page-based handling and brought the web to life with streamed content and, better still, such content was automatically protected because it was rendered on the fly and couldn’t be saved. Flash became central to professional web design and the natural extension for HTML. Usage exploded and the Flash player became the one plugin everyone installed. With penetration approaching 100%, developers could assume its presence – almost unnoticed it became “the world’s most pervasive software platform”, with greater reach than any individual browser or operating system.

Macromedia realised that it owned the universal online runtime, and decided to bring to the web the sort of interactive computing experience you could only then get on a desktop PC. In 2002, it released a white paper by Jeremy Allaire that floated the idea of the Rich Internet Application (RIA). Flash would no longer merely extend HTML pages by embedding multimedia content: the player would become the “rich client” supporting standalone, browser-hosted applications that enabled users to do stuff, not just see stuff.

A RIA could be anything from flipping an online magazine page to a virtual shopping mall, from videoconferencing to word processing. This up-shift from add-on to rich client was a major undertaking, and extending the web into a ubiquitous computing platform would step on some important and sensitive toes. Macromedia needed far more serious backing and, after failed talks with Microsoft, in 2005 it was acquired by long-standing rival Adobe.

Adobe extended Flash’s capabilities into its Creative Suite, the open source Flex framework and Flash Builder IDE. In 2008 it launched the Adobe Integrated Runtime (AIR), which made it possible to run Flash-based RIAs offline on a desktop PC.

Nothing but the web

After Adobe, the company most interested in making the web into a universal computing platform was Google. The ability to work well with plugins with improved security, standardised rendering and separate execution was central to its own Chrome browser – features that Google made available to other browser developers via its Pepper Plugin API (PPAPI). It even merged the Flash player directly into the Chrome runtime. The web itself was going to become everyone’s computing platform, and current heavyweight operating systems such as Windows and Mac OS would effectively become redundant; with data and applications handled in the cloud, the OS was needed only for loading the browser and the rich client.

In late 2009, Google announced plans for Chrome OS – a stripped-down, web-only, cloud-focused operating system aimed at netbooks, desktop PCs and a new class of handheld, touchscreen devices called tablets. By the time Google finally released its promised “Chromebook” in June 2011, response was muted. The advantages were clear enough: low-cost and maintenance, fast boot-up, security, and ubiquitous access to your cloud-based content that was easy to share with collaborators and was automatically backed up. The problems were just as clear: who in their right mind would choose ugly-looking and underpowered web applications over smooth, fast native desktop apps?

But this is to miss the significance of that rich client: if I were to use a Chromebook, I wouldn’t use an HTML-based application such as Google Docs; I’d use a Flash-based RIA such as Adobe’s own free Acrobat.com suite (Buzzword word processor, Presentation graphics, Table spreadsheet, Forms Central and ConnectNow web conferencing). Acrobat.com is far from perfect, but it demonstrates the potential for streaming advanced applications. But what about applications with serious data and processing requirements? Surely you can’t do something such as an online Photoshop in Flash? Well, yes and no: check out the Photoshop Express Editor from Photoshop.com (it’s free) and you’ll find it surprisingly powerful for consumer-level photo editing. I can readily imagine a future version becoming Pro capable.

Never forget that your local device doesn’t have to do all the processing; all the heavy lifting can be handled remotely, and all the rich client need do is to stream onscreen activity from that server via a live connection. Do you really believe that your PC can render a 3D animation faster than Google’s, or Adobe’s, or any other cloud-based provider’s server farm? This idea of Software-as-a-Service (SaaS), where application providers carefully balance server-based number crunching against rich client-side rendering was the long-term dream of the rich cloud. Even the most demanding supercomputer applications are thinkable as RIAs.

The web needn’t be a lowest-common-denominator experience, which is the case with current HTML and JavaScript-based web apps. The fundamental difference between a thin client and a rich client is that the latter promises the best of both worlds: server-based processing power delivered via a lightweight, design-rich front-end. It makes as much sense to share multimedia data, applications and processing power via the rich cloud as it did to share paged static information via HTML.

This revolution looked unstoppable, because another great advantage of plugin-based computing is that it’s immune to sabotage by competitors. When Netscape and Java first raised the possibility of the web as a platform, Microsoft famously responded with Internet Explorer and its strategy of “embrace, extend and extinguish”. That wasn’t possible with Flash because Microsoft couldn’t do anything to break Adobe’s self-contained, universal plugin – if it couldn’t stop Flash, its only option was to provide a superior alternative.

The result was Silverlight, a beautiful system built around the truly rich (and open!) mark-up language XAML, which enabled Microsoft’s army of Windows desktop developers to seamlessly translate their .NET programming skills into the development of powerful RIAs for cross-platform, browser-based delivery. Forget about Acrobat.com and imagine a cloud-based, Silverlight-delivered version of Office, and SaaS versions of every other Windows application. Silverlight revealed another strength of plugin-based delivery: truly open competition. Content producers could choose between Flash or Silverlight, according to their respective design and development strengths, but their end users didn’t need to make any choices at all, since both plugins (and any others) could happily co-exist.

With two competing rich clients, the stage was set for the web to reach its full potential, and Google, Adobe and Microsoft were all fully signed up to this vision. So were all the other major OEMs such as RIM, Samsung and HTC, via the industry-wide Open Screen Project that committed members to support Adobe’s new Flash mobile player and AIR with all their future mobile devices...

The death of the rich client

One man had other plans. Steve Jobs wasn’t ready to see his mobile devices turn into vehicles for rich cross-platform computing, capable of supplying the same sort of rich content and applications as his own native iOS apps and, hence, depriving his platform of its USP and exclusivity (and his App Store of its 30% margin). In the longer term, why would Jobs want the open web to turn into a rich and open cloud? What was in it for Apple?

Unlike Microsoft, Apple makes computer hardware, so what would be the future for Apple if the cross-platform web became the computing platform of the future? Steve Jobs’ visceral hatred of Flash wasn’t at all because it was “yesterday’s technology”, but precisely because it was “tomorrow’s technology” – and it could destroy his empire. Having watched his first lucrative walled garden demolished by Adobe – when it made its Mac-only publishing and graphics applications available under Windows – Steve Jobs knew that he needed to act fast.

When Jobs posted his carefully crafted “Thoughts on Flash”, and made it clear that the iOS platform and the iPad were never going to support plugins, it must have felt like a sweet revenge. At a stroke he killed off Google’s chances of turning the Chromebook into a serious competitor, and rendered worthless all the money and effort Microsoft had put into developing its cross-platform Silverlight. I can’t help imagining that, as he hit Send, he too may have been saying “Farewell Adobe. Delete.”

But isn’t this getting a teeny-bit paranoid? After all, it’s the man’s right not to support plugins if he doesn’t want to. Why personalise things this way? More to the point, if the future was so much brighter with Flash and Silverlight, why didn’t those other companies just stick with their vision, out-compete Apple and let the free market decide the matter? When even Adobe gave up so quickly, surely that demonstrates that Steve Jobs was right?

I still think of my PC as belonging to me rather than to Steve Ballmer, and I think that all interested parties should be able to – and be encouraged to – make software to run on it, and that it should then be up to me to decide which of that software I run. Flash-blocking by individuals shows freedom of choice in action, but a blanket ban by a device manufacturer shows the exact opposite. This must be one of the most extraordinarily anti-competitive acts in history.

As for the argument about “letting the market decide”, I wish it were that simple. I did hold out hopes that the lack of Flash support may hit Apple commercially, and that it would be forced to rejoin the cross-platform consensus on which the web was built. I even think Adobe might have continued with the mobile Flash player if Microsoft had announced support for both it and Silverlight within Metro. However, I can also understand how Microsoft looked at Apple’s business model and realised that killing off both rich clients in Metro and opening its own app store made far more financial sense.

What about Android? Why didn’t Adobe keep the faith there and at least keep the Flash flame burning? It’s this betrayal that has made developers such as Kevin Partner so angry – but sadly, I think that Adobe was right. When it comes down to it, the web is universal or it’s nothing. If Apple unilaterally decided to drop JPEG support, we’d all have to shift over to GIFs. Now that it’s dropped support for SWF, we have to shift to HTML5 and SVG and the video formats, where we can. Ignoring iOS simply isn’t an option. Killing off Flash in the browser is the last thing Adobe wanted to do, but it’s right to recognise the inevitable, and responsible to take the lead.

The universal cross-platform web plugin model had many strengths, but Jobs uncovered its hidden and catastrophic flaw: as soon as one platform refuses to support it, the plugin and its functionality is immediately rendered useless for everyone. This idea of not supporting any plugins should have been unthinkable, was unthinkable; but once he’d thought of it, Jobs knew that it couldn’t fail.

I recognise that Steve Jobs did far more in his life than ban Flash, but my anger and admiration is indeed personal. There was only one man on the planet who could possibly imagine taking on the rich cloud dream and wrestle the web genie back into the bottle. There was certainly only one man who could pull it off and still be revered as a saint – imagine what would be happening now if Steve Ballmer had announced Microsoft was unilaterally ceasing to support a web technology used by more than two million designers and developers, by more than 80 of the top 100 sites, and more than 99% of end users.

What’s happened has happened: Apple has won and the professional web community has to face the new reality. So is it farewell to Adobe?

There’s no doubt that many thousands of developers, such as Kevin, will decide that now is the time to move on. Ultimately, though, the mission of making both the web and computing experiences as rich, powerful and personal as possible will continue. Without mobile delivery, Flash SWF has limited long-term prospects in the browser, and the cross-platform future belongs to HTML5 online, and AIR offline. Adobe is still best placed to provide the tools (and to use SWF to provide continuity and backwards-compatibility). Adobe may have been mugged and hospitalised, but AIR and HTML5 are a new exciting territory in which I believe the company will survive and thrive.

What has died is a beautifully simple, all-purpose, universal rich web format for extending what HTML pages can do. With the death of both Flash and Silverlight in the browser, we’ve also lost a live, direct and untaxed connection between content producer and content consumer. Why should we pay 30% to Apple, Microsoft and Google for native apps that would actually run far more efficiently in our browser?

The biggest loss of all is that long-term dream of supercomputer cloud power delivered directly to cheap, secure, simple and personal rich clients. We may have been given “the web in our hands”, but we’ve been deprived of the rich cloud.


Thursday, 20 March 2014

Upgrade TP-Link MR3020 to the Latest Firmware

Download the latest firmware from http://www.tp-link.com/en/support/download/?model=TL-MR3020&version=V1#tbl_j

Next, connect an Ethernet cable between the PC and the TL-MR3020. We use a cable because it is more reliable than wireless, and a firmware upgrade needs stability. Make sure the PC stays powered on while doing this: if the process is interrupted while the firmware is being written, it can brick the hardware itself, so using a UPS is recommended.



Change the IP address of your LAN adapter. In Windows 7 this can be done from Start > Control Panel > Network and Internet > Network and Sharing Center > Change adapter settings > Local Area Connection > Properties > Internet Protocol Version 4 (TCP/IPv4). Because the router's default address is 192.168.0.254, set the PC to 192.168.0.X (X = any number from 1 to 253) with a subnet mask of 255.255.255.0.

[Screenshot: setting the IP address]
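If you prefer the command line, the same change can be made from an elevated command prompt. A minimal sketch, assuming the adapter is named "Local Area Connection" and using 192.168.0.100 as the temporary address:

    netsh interface ip set address "Local Area Connection" static 192.168.0.100 255.255.255.0

When the upgrade is finished, running netsh interface ip set address "Local Area Connection" dhcp puts the adapter back on automatic addressing.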

Now open your browser and go to 192.168.0.254. You will be prompted with a login box.


[Screenshot: login page]

The default username is admin and the password is admin.


In the left sidebar you can find System Tools > Firmware Upgrade. Choose the firmware file that we already downloaded.
When asked to confirm, just press OK, then wait until the router reboots. After that, check the status page.

[Screenshot: main menu]



Voila! The router has upgraded cleanly. You can see that the version has changed after flashing the new firmware. That's it.

Monday, 17 March 2014

How to Boost Your Wireless Signal

In general, speed and range issues all can be lumped together as performance issues. You want both your speed and range to be as robust as possible. There are several factors that can impact both aspects of performance.

Distance can certainly impede performance. You may have a room in your home or office that is simply too far from your wireless router. Even the way your home or office is structured could be a culprit in poor wireless performance. If the signals have to bounce around too many corners to reach your wireless devices, that can cause problems (although beamforming, a technology in newer, premium routers, can help direct a router's signal toward wireless clients).

Interference with the signal can be a big factor in performance, too. If you live in an apartment building, your home might be inundated with signals from everyone else's routers. Maybe structural interference is the culprit. If your furnace, washing machine, and dryer are all between your router and your laptop, that doesn't help.

Maybe it's the software you're using. Routers need software updates just like everything else - and sometimes the firmware they initially ship with is improved with a later-released update.

These are just a few of the possible reasons your connection might be poor (or nonexistent). Fortunately, there are many ways to extend your wireless signal, and most of them simply involve a bit of tweaking to your wireless network or adding some affordable components. We'll walk you through ten of the most useful fixes for your connectivity woes.

Read the next ten tips to extend your Wi-Fi signal. Some of the suggestions require no additional hardware or software purchases, while others may require a small or larger investment, depending on the particular performance problem you're experiencing.

1. Change the channel

Wi-Fi routers operate on specific channels. When you set up a typical router, it usually chooses a certain channel by default. Some routers choose the least-crowded channel, but yours may not have. Check for yourself which Wi-Fi channel is the least crowded to boost the router's performance, and perhaps its signal range. A good, free tool to use is inSSIDer. Don't be put off by the graphs and excess information; what you want to focus on is the "Channel" column. If most of the routers in your area are on channel 6, for example, switch yours to a less-crowded one, such as 1 or 11 (the non-overlapping 2.4GHz channels). You can change the channel of your router by going into its interface. All routers have different ways to access the interface, so check with your manufacturer.
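If you'd rather not install anything, Windows can also list nearby networks and the channels they occupy from a command prompt; a quick sketch (the exact output format varies by Windows version):

    netsh wlan show networks mode=bssid

Look at the "Channel" field of each BSSID entry and pick a channel that few of your neighbours are using.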

2. Update router firmware

Updating router firmware is often overlooked by home users. Business networking devices usually display some sort of notification when newer software for the device is available for download. Consumer products such as home wireless routers, especially older routers, don't always offer this notification. Check often for firmware updates for your router. There is typically a section in the router's interface for upgrading the firmware. However, you often have to go to the router manufacturer's website and search for the firmware (most vendors make searching for firmware pretty easy) and then upload it through the router's interface. There are often accompanying release notes that tell you what the firmware helps to fix; often the fixes are for connectivity problems.

3. Update adapter firmware

Just like routers, network adapters on PCs and laptops are also subject to firmware updates. Remember, good wireless range and performance is dictated not just by the router but by the network adapter on the clients (as well as other factors, but these are the two biggies). Most laptops have on-board adapters. Go into your network settings to find the name of the adapter (via Control Panel in Windows), then go to that adapter manufacturer's site to make sure you have the latest firmware.

4. Change position

Do you have your wireless router nestled up against your broadband modem, tucked away in the entertainment center of a basement that's been converted into the family den? Move it, if you have range issues. You don't have to keep the router in close proximity to your modem. Ideally, a Wi-Fi router should be in a central location. You can purchase a custom-length Ethernet Cat 5 cable from Best Buy or any place that services computers if you need more flexibility in centrally positioning the router (although if you do that, this is technically no longer a free option).

5. DD-WRT

For the adventurous: DD-WRT is open-source firmware for routers. It's known to ramp up router performance and extend the feature set beyond what typically comes with most routers. Not every router supports it, but the number of routers that are supported keeps growing. Warning: installing DD-WRT may well invalidate your router's warranty, and many manufacturers will not help you troubleshoot router issues once you have DD-WRT on them. Hence, this is not a recommended option for routers under warranty or in a business network. There are also no guarantees that DD-WRT upgrades won't negatively affect a router. However, many users find it a free way to trick out their routers. So, if you have an older, spare router lying around, or want to take the plunge to see if DD-WRT firmware helps your range issues on a newer router, check whether it's supported on the DD-WRT site. Also note that it's not easy to remove DD-WRT from some routers without doing a lot of research.

6. Set up a second router as an access point or repeater

You can set up just about any router as a wireless access point. To do so, you need to connect the second router's LAN port to the primary router's LAN port. On the second router, you will want to give it addressing information on the same subnet as the primary router. For example, if your primary router's IP address is 192.168.2.1 and its netmask is 255.255.255.0, then you could make the second router's IP 192.168.2.2 and use the same netmask. It's also important that you assign the same SSID and security settings on the second router, and turn DHCP off on the second one as well. A concrete sketch of this scheme is shown below.
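Here the 192.168.2.x addresses and the "HomeNet" SSID are just example values, not requirements:

    Primary router:  IP 192.168.2.1  netmask 255.255.255.0  DHCP on   SSID HomeNet
    Second router:   IP 192.168.2.2  netmask 255.255.255.0  DHCP off  SSID HomeNet (same security type and key)

With DHCP left on only at the primary router, clients that join either box get addresses from the same pool and see one seamless network.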

Newer routers make this process easier. If you have a second router that's only about a year old, most can be set to operate in "access point" or repeater mode, and configuring it is as simple as clicking a button. Check with your router's manufacturer or documentation. You can also just purchase a dedicated access point such as the Linksys by Cisco Wireless-N Access Point with Dual-Band (WAP610N). This is a more expensive option, but it will likely save you some network configuration headaches. Best bet, if you go this route: use an access point from the same manufacturer as your router.

7. Antennas

Newer 802.11n Wi-Fi routers increasingly come with internal antennas. Some still have or support external ones, and these antennas can often be upgraded. Consider a hi-gain antenna, which you can position so that the Wi-Fi signal goes in the direction you want. Hawking Technology, for example, offers the HAI15SC Hi-Gain Wireless Corner Antenna. Though we have yet to test it, Hawking claims it boosts wireless signal strength from a standard 2dBi to 15dBi. Antennas like these can attach to most routers that have external antenna connectors. Hi-gain or "booster" antennas range in price from $40 to $100.

8. Repeaters/Extenders

Most major wireless networking vendors offer devices that act as repeaters or wireless extenders. While they can extend a Wi-Fi signal, they can be tricky to set up, can cause interference with the signal, and can be expensive. A good repeater or extender can set you back almost $200.

9. New Router/Adapters

How about getting a new router and adapters altogether? Upgrading your home network to 802.11n and using the 5GHz band should give a noticeable performance improvement. The 2.4GHz band is said to have greater range than the 5GHz band, but that only becomes apparent when supplying wireless coverage to large areas such as college campuses or municipalities. In our router testing of smaller areas, such as a typical home network, 802.11n on the 5GHz band maintained better throughput than 2.4GHz with most routers, even at greater distances. It's a more expensive option, but if wireless connectivity is crucial for you, it's a plausible one. If you go with an 802.11n router, you will, of course, need client adapters that support 802.11n as well. USB-based 802.11n adapters are a convenient way to update a laptop that may have an older on-board adapter.

10. Single Vendor Solution

Vendors are quick to say that their products will work with other vendors' products. But it just makes sense: Cisco network adapters will work better with Cisco routers, Belkin adapters work optimally with Belkin routers, and so on. If possible, try to limit your network devices to one vendor; that means not only your router and adapters, but also antennas, repeaters and access points.

Tuesday, 04 March 2014

The Only Reason to Buy Individual Stocks

I recently wrote about why I no longer buy individual stocks. I will now detail my one exception to this rule. If I were working at Google or Facebook, or owned large amounts of those stocks that I couldn't or didn't want to divest myself of (very common in the case of RSUs that aren't vested), then this is what you need to do.

For the rest of your portfolio, when you're buying indices, you can custom-construct an index such that those individual stocks are left out. While it only makes sense to do this if your portfolio exceeds a certain amount, the net result is that your overall portfolio is better balanced, because you don't have to own Google, for instance, both in the index and in your individual holdings.

What's the portfolio level at which this makes sense? At one point, the only solution was Parametric Portfolio Associates, at a minimum of $5M. However, today, Wealthfront has dramatically lowered the entry level to $500,000. Once you've got such a custom index built, as Wealthfront has detailed, you can do tax-loss harvesting at the individual-stock level, raising your tax-loss harvesting potential even further. This is a huge incentive for high-net-worth individuals to use Wealthfront, as its fees are substantially lower than Parametric's, for instance. In fact, once you build a custom index, you no longer have to pay Vanguard's management fee, so Wealthfront's equivalent portfolio costs come close to Vanguard's, and the additional tax-loss harvesting delta tips the edge in Wealthfront's favor.

In How to Interview a Financial Advisor, I explain why most Googlers who were offered the opportunity turned down Parametric's offer. The problem is that if you fire Parametric, you're stuck with a portfolio of 499 individual stocks, and you have to either manage that portfolio yourself or find someone willing to do it for you. As you can imagine, unless you're about to write your own software to do this, that effectively makes Parametric/Wealthfront a roach motel: you can always check in, but you can never leave.

When I spoke with Wealthfront about this latest feature, I proposed a solution to this issue. It's very much technically feasible, and I look forward to seeing whether Wealthfront will implement it. The nice thing about being a Silicon Valley startup is that it can move far faster than the stodgy Wall Street banks that are used to big, fat margins.

As for myself? I will continue to move more assets over to Wealthfront as they free up elsewhere. The additional bonuses from what they're doing at higher asset levels are just too much even for this DIYer to pass up.


Sunday, 02 March 2014

Day 30 Neuhausen to Kloten


We woke up in the morning to overcast skies and wet roads. Looking at the weather forecast, it looked like rain was in the cards for the next morning as well, so I called Hotel Flyaway and informed them that we would arrive a day early. We ate an anemic breakfast and got on our bikes and headed again for the Rheinfall, since our route would initially be along Swiss bike route #2.


[Photos: Tour of the Alps 2011]

What a difference 12 hours made! Instead of the beautiful water, now all we saw was gray. Riding past the falls, we quickly found ourselves back in Germany!



The gray skies continued all the way until we came back into Switzerland, but at last the rain stopped. At Flach we ignored the bike path sign to Eglisau and headed instead towards Embrach and Kloten. Since our GPS units already had Hotel Flyaway set as a way point, we had the units set, but ignored the GPS most of the time since the bike path signs were more reliable.



Upon leaving the Rhein, the route led relentlessly uphill for an hour or so, but at a gentle grade and without much rain. Even the roads were starting to dry. Once in Embrach, bike path signs for Kloten showed up and we followed them over a hill and found ourselves descending at speed into Kloten, where the GPS units and Phil's memory led us easily back to the hotel. We checked in, got our bike cases out, and started taking apart the bikes for stowing. The skies were blue when we checked in, but as we started wrenching on our bikes rain came down in sheets and thunder shook the skies around us. We had clearly ended the tour just in the nick of time.



It took us about 45 minutes to pack the bikes, after which we cleaned up, feeling happy about not having to do laundry for a change. Our freshly laundered clothing had arrived safely from St. Moritz, as had our Amazon.de order for maps. We ate a late supermarket lunch, then started shopping for chocolate. Phil had never really spent much time in downtown Zurich, and Cynthia had wanted us to bring back some Lindt Summer Edition chocolate, so we caught a train downtown with a day pass. We took a quick break at the Sprungli cafe downtown for some chocolate cake.



I was starting to despair at finding summer edition chocolate when I finally found it in the basement of the City Coop near the Sprungli store. Phil and I each bought about $100 worth of chocolate. We then had dinner near the train station and headed back to Hotel Flyaway just in time to see the Double Rainbow.



We had one more day in Zurich, but our tour was over.

Wednesday, 26 February 2014

Windows 8 moves to BIOS based product keys

Windows 8 PCs now keep their product keys in the BIOS, a move that offers both pros and cons.


In the past, a new Windows PC would display its product key on a sticker, usually on the side of a desktop and on the base or the bottom of the battery compartment on a laptop. But with Windows 8, Microsoft has switched gears and now stores the key as a BIOS setting instead.

A response to a question on Microsoft's Answers website confirmed the details earlier this month:

One of the improvements Microsoft is making to Activation 3.0 for newly built machines that come preloaded with Windows 8, you won't have a COA (Certificate of Authenticity) sticker attached to the machine anymore. Instead, this will be embedded in the BIOS. This will avoid product keys from being compromised and OEMs will buy what they need.

So if you need to reset or reinstall Windows 8, you don't need to hunt for the product key. It's automatically applied and activated. That's certainly a plus, especially when the numbers on those product key stickers wear off or are just too small to read easily.
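As an aside, on machines that ship with the key embedded in firmware, you can usually read it back yourself from an administrator command prompt. This queries the standard WMI licensing class and returns nothing on PCs without an embedded OEM key:

    wmic path SoftwareLicensingService get OA3xOriginalProductKey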

Microsoft certainly benefits from this new activation process, since a Windows 8 product key embedded in one PC seemingly can't be used on another.

But therein lies the problem for the user.

Let's say you own a new PC running the standard version of Windows 8, and you own a separate retail copy of Windows 8 Pro with its own product key. You then install Windows 8 Pro on your PC. Will Windows insist on using the embedded product key, or is there a workaround so you can manually enter the key that came with your Windows 8 Pro software?

Or, let's say your current Windows 8 PC dies and you want to install the OS on a different computer. How can you do that if the product key is locked to the dead PC?

A couple of people posed those same questions on the Microsoft Answers page but have yet to receive answers.


Monday, 24 February 2014

How to Interview A Financial Advisor now out in paperback

My latest book, How to Interview a Financial Advisor, is now out in paperback. For this book, I'm enrolling in the Matchbook program, which means that if you buy the paperback, you can get the digital copy as well for $2.99. I loved the Matchbook program, and I was happy to sign this book up for it, both as an experiment and because I feel it's a great way to provide a discount: buy a copy for your friend, and keep a copy for yourself!


Wednesday, 19 February 2014

How to Make CROSS and STRAIGHT Cables

________________________________________
STEP 1: Choose the right cable…
1. To connect PC to PC: CROSS cable.

2. To connect PC to HUB/SWITCH/ROUTER: STRAIGHT cable.

3. To connect HUB/SWITCH/ROUTER to HUB/SWITCH/ROUTER (the same kind of device): CROSS cable.

STEP 2: Understanding CAT 5 Cables…

Wires: CAT 5 Cable has 4 pairs of copper wire inside it.

Colors: Standard cables have BROWN, BROWN-WHITE, GREEN, GREEN-WHITE, BLUE, BLUE-WHITE, ORANGE and ORANGE-WHITE wires.

STEP 3: Making Straight Cable…

Nomenclature: let us first fix a numbering scheme for the wires, which we will follow throughout this tutorial: ORANGE-WHITE (1), ORANGE (2), GREEN-WHITE (3), BLUE (4), BLUE-WHITE (5), GREEN (6), BROWN-WHITE (7), BROWN (8).

Requirements: two RJ45 connectors, a crimping tool, and CAT 5 cable of the desired length (for Ethernet, keep each run under 100 meters).

STEP 3.1:

There are two standards adopted for Cabling EIA/TIA 568A & EIA/TIA 568B.

When you use a single standard (either EIA/TIA 568A or EIA/TIA 568B) on both ends of the cable, the resulting cable is a STRAIGHT cable.

On the other hand, if you use a different standard on each end of the cable, the resulting cable is a CROSS cable.

I’ll use the EIA/TIA 568B standard for creating both the cross and the straight cable.
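For reference, here are the two standard pin-outs side by side (this is the published wiring, added here for convenience):

    Pin   EIA/TIA 568A    EIA/TIA 568B
    1     GREEN-WHITE     ORANGE-WHITE
    2     GREEN           ORANGE
    3     ORANGE-WHITE    GREEN-WHITE
    4     BLUE            BLUE
    5     BLUE-WHITE      BLUE-WHITE
    6     ORANGE          GREEN
    7     BROWN-WHITE     BROWN-WHITE
    8     BROWN           BROWN

Notice that the two standards differ only in swapping the orange and green pairs, which is exactly the swap a cross cable needs.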

1. Remove the outer covering of the CAT 5 cable.
2. Straighten the eight wires of the cable.
3. Using the crimping tool’s cutter, trim the ends of the wires so that they are all the same length.
4. Arrange the wires in order 1, 2, 3, 4, 5, 6, 7 and 8 respectively, as given in the nomenclature above.
5. Insert the arranged wires into the RJ45 connector with the clip pointing down.
6. Insert the head of the RJ45 connector into the crimping tool and crimp (press) it firmly.
7. Follow the same steps, with the same color order, for the other end of the cable too.
8. The cable you have made by following these steps is a STRAIGHT cable.

STEP 4: Making CROSS Cable…

Of the eight wires in Cat 5, not all are used for data transfer with a 100 Mbps Ethernet card. Only two pairs are used: two wires for transmitting the signal and two wires for receiving it.

So now you can guess why we have to make a CROSS cable for connecting the same kind of devices: if we use the same color coding on both sides, the transmitter of one machine will send data to the transmitter of the other and the packets will be lost. We therefore change the wiring order so that the transmitter of one connects to the receiver of the other, and vice versa.
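To make that concrete, with 100BASE-TX pins 1 and 2 carry the transmit pair and pins 3 and 6 carry the receive pair, so the cross cable simply swaps those pairs end to end:

    A-side pin 1 (TX+)  ->  B-side pin 3 (RX+)
    A-side pin 2 (TX-)  ->  B-side pin 6 (RX-)
    A-side pin 3 (RX+)  ->  B-side pin 1 (TX+)
    A-side pin 6 (RX-)  ->  B-side pin 2 (TX-)

Pins 4, 5, 7 and 8 are wired straight through.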

Here are the Steps:
Steps 1 to 6 are the same as for the STRAIGHT-through cable.
7. The only difference is the color coding on the other end of the cable.
8. The wire on pin 1 on the A-side (one end) should be on pin 3 on the B-side (the other end), and vice versa.
9. The wire on pin 2 on the A-side should be on pin 6 on the B-side, and vice versa.
10. Now crimp the RJ45 connector.
11. Your CROSS cable is complete.

Friday, 07 February 2014

April 7th: Kelly's Cove, Norman Island to White Bay, Jost Van Dyke


When I originally set up the trip, I thought that charters were for 7 days (April 1 to April 7th). It turned out that charters were for 7 nights/8 days, which meant that we had one extra day on the 8th. Shauna and Steve, unfortunately, didn't get the message, and scheduled their flight for the 8th. Given when we were going to return the boat, I was pretty sure they wouldn't make the ferry from Tortola, so they opted to visit St. John and get dropped off a day early from Jost Van Dyke instead. Given the way things worked, it would have been closer to drop them off at St. John, but of course, that would have required clearing customs on both ends twice, which would have been very annoying and painful. As forecast, the north swell had ended the night before, so going to Jost Van Dyke would be an easy sail.
[Photos: Escape Catamaran 2012]

Except that I'd lost track of where St. John was, and we sailed the wrong way for a bit before I realized my error and turned the boat around. In the Narrows, we got to gybe often, but near Soper's Hole a storm blew through, giving us 15-knot winds for a while and a thrilling sail as we made it right through the passages and the Thatch Island Cut on the wings of a shower-storm that (unfortunately) produced no rainbows. This would be the only rain we saw for the entire trip.

The wind died mysteriously once at the Thatch Island Cut, however, and I was forced to turn on the motor. It being not even 9am, we made good time and I realized we had the time to visit Sandy Cay for some snorkeling and swimming, so we headed there to the beautiful beach which I had bypassed during my last visit and was determined not to miss this time. Eschewing the mooring buoys, I pulled up close to the beach in 6 feet of water and dropped anchor, putting the boat a short swim from shore.

The water was cooler, being closer to the Atlantic, but the Cay was too beautiful to pass up. Even Cindy said, "I hadn't planned on getting into the water today, but this is too good." I even saw a ray in the water, though again I wasn't quick enough on the draw to take a picture. Unfortunately, even Eden has a snake. As far as Sandy Cay is concerned, it was midges: tiny no-see-um bloodsuckers that did not seem to pay attention to our insect repellent. Despite that, we spent an hour there, watching as a day-chartered catamaran, the DayDreamer, pulled right up onto the beach, disgorged a full complement of tourists, and then pulled away to pick them up later.

By 10:45, it was time to leave, so we pulled up the anchor and motored over to White Bay, an anchorage with very badly marked entrances. Playing it safe, we took the western entrance, motored to the mooring buoys, and dropped anchor in about 8 feet of water, with the tail of the boat a short jump from shore. I was getting very confident of our anchoring skills, and placed the anchor with no problems whatsoever. After waiting for the boat to settle, we put Shauna and Steve's luggage into the dinghy and prepared to motor to Great Harbor, where the ferry was. They had declined to go ashore at White Bay, even though I assured them that it was a perfectly fine place to hang out. John had developed an ear infection, so he and Amy were coming along to find the clinic. XiaoQin decided to come along just for the ride.

With the dinghy so laden, we wallowed over to Great Harbor and dropped everyone off. The return, however, was fast: with just me and XiaoQin in the boat, the dinghy leapt between waves, and it was a much shorter (if much bumpier) ride back. Arturo, Cindy, XiaoQin and I then had a sandwich lunch on the boat while watching the herons around us dive for fish.

We had thought that we might have to eat out tonight, but we had over-provisioned the boat and there would be room for an eclectic dinner that night. Nevertheless, the day was warming up very nicely, so we took a swim to Ivan's Stress-Free Bar for a drink and to enjoy the hammock. While we were lying in the hammock, a mega-yacht pulled up in the harbor, and 10 people with British accents sat on a bench with various quantities of wine and water, and then pulled away in a large tender, saying "Farewell Caribbean."

Amy and John arrived while Arturo was lying in the hammock, announcing that they had indeed found the clinic and gotten antibiotics for John's ear infection. While Arturo, XiaoQin, and I decided to swim over to the Soggy Dollar Bar, Amy and John decided to walk over, since they had walking shoes on.

The contrast between the two bars was amazing. First of all, the beach was packed with motorboats, many with 1,500hp worth of engines, enough to make it to Puerto Rico in very little time. The Soggy Dollar Bar clearly attracted the "Spring Break" crowd. It was a noisy party with people ordering drinks left and right. I ordered some conch fritters, since I wanted to know what conch tasted like, and we watched the crowd while waiting for our order, which showed up on island time. On the way back to the boat, I left behind my waterproof case containing cash and my dive cert, and only realized it after getting onto the Escape. I immediately put on fins, snorkeled back out to the entry point, and found four spring breakers who had obviously picked up my wallet and asked me my name. When I gave it to them, they looked at my ID and gave me my money back.

Amy and John took out the kayak, and Cindy made a potpourri dish with what was left of our supplies. It wasn't a great meal, but it saved us from having to swim to shore for dinner and back. The sunset was gorgeous, and the nearly full moon, when it rose, gave us a gorgeous picture. I was a little sad that it was our last night in the BVIs, but I had made a number of enthusiastic cruising sailors on this trip, which made me feel quite good.


Monday, 03 February 2014

Why Doesn't an iPad Charge When Connected to a Computer?

An iPhone charges just fine when plugged into a computer's USB port. The same goes for an iPod, a pocket camcorder, and any of countless other devices. So what's the iPad's problem?

Actually, the difficulty isn't with the iPad, but with the USB port: it's not supplying enough juice. The ports built into most desktops, laptops, and even powered USB hubs don't generate the 10 watts necessary to charge an iPad's battery, which is why the 'Not Charging' message appears over the battery indicator. That's a definite bummer, as it means you can't use, say, your laptop to recharge your iPad on the road. In fact, Apple recommends finding an AC outlet and using the bundled 10-watt power adapter.
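The arithmetic is straightforward: a standard USB 2.0 port is specified to supply 5 V at up to 500 mA, or about 2.5 watts (a USB 3.0 port manages 900 mA, roughly 4.5 watts), while Apple's 10-watt adapter delivers on the order of 5 V at 2 A. A typical computer port therefore tops out at a quarter to a half of what the iPad wants.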

That said, lower-powered USB ports will recharge an iPad--just very, very slowly. When the tablet's screen is off, the battery does indeed draw a trickle of power. Turn the iPad back on, however, and you'll see 'Not Charging' again. (It's kind of like yanking open the refrigerator door to see if the light is still on.)

You should be aware that many PCs pump varying wattages through different USB ports. Frequently the USB ports at the front of the PC are of lower power than the ones in the back. Before giving up, try moving to another port.

Saturday, 01 February 2014

How to Add a Second Monitor

Increase your PC's screen spread in five easy steps.


In recent years, wide-screen displays have diminished the need for multiple monitors on your desktop—their extra width provides plenty of space for comparing documents, arranging windows, or viewing particularly unruly spreadsheets. But installing a second monitor can still provide some hard-to-beat freedoms. On some desks, it’s easier to arrange two smaller displays than one very wide one. Also, you can maximize frequently used programs on the second screen, leaving your first monitor clear to the desktop. And many games let you use your second display to show crucial information while the first remains focused on the action.

You’ll marvel at how much having two—or more—monitors can better organize your computing life. Procure a second monitor, and you’re ready to go. (Though it’s not strictly necessary, for best results, your second monitor should have the same diagonal screen size as the first; if they’re both flat panels, a matching native resolution is ideal, too.)


1. Make the hardware hookups

For you to use two monitors, both must be able to physically connect to your computer. Many PCs are built on motherboards with integrated graphics and more than one monitor output; others may have a graphics card installed that has dual monitor connectors. If your computer falls into either category, you’re set—skip to the last paragraph of this step. If not, you first need to upgrade your hardware to support dual displays.

The easiest way is by installing a graphics card with two monitor connectors. Just make sure when shopping for the card that the kind of connectors it has matches those on your monitors. A look at the monitors’ manuals, or at where you plug in the various cords and cables, should give you the information you need. Almost all new cards today come with two DVI ports, or one DVI port and one VGA port, and some provide a DVI-to-VGA adapter in the box. (These adapters are also available separately.) You may also see a DisplayPort connector on some modern cards, but this connection type isn’t widely used yet.

Installing the new card is simple. Turn off your computer, unplug the power cable, and open the case. You’re looking for the PCI Express (PCIe) x16 slot, the longest slot on the motherboard. (At the slot’s inward-facing end, you should see a small release lever.) Assuming there isn’t a card in the slot already, free up the slot by removing the metal spacer where the slot intersects with the PC’s rear panel. You may have to unscrew the spacer with a Phillips screwdriver, though some cases employ restraining clips instead. (Note: Some extra-wide video cards require you to remove two spacers rather than just one.)

Once you have a slot open, align the back edge of the video card (the part with the monitor connectors) with the open space and carefully push the card into the slot until it’s secure and evenly seated. If it’s not fully inserted, you could run into problems when you turn your computer on again.

Some video cards draw all the power they need through the PCIe expansion slot, but others—especially high-end models—need a direct feed from the power supply, too. If your card does, find a free six- or eight-pin connector from the power supply, and connect it to the appropriate jack on your card, usually located along the card’s innermost edge. Next, screw the card into the slot (or, if you have a tool-free case, secure it using the case’s mechanism), close up the system, and replace the power cable you unplugged earlier.

Connect both of your monitors to the video connectors on your computer and/or video card, and to power outlets. Turn them on, then boot up your computer.

2. Install drivers, if needed

If, when booting up, you notice that your computer is displaying the same image on both monitors, then you’ve done everything right. If you installed a graphics card in the previous step, you’ll need to install the appropriate drivers for it. Chances are the card came with a disc that contains the software you need; pop it into the optical drive and follow the instructions. Alternately, you can ensure that you have the most recent drivers by downloading the latest versions from the video chipset’s manufacturer; that’s probably either ATI or Nvidia. If you go this route, simply double-click the file you downloaded, and the software should guide you through installation.

3. Set up your second monitor

Once your second monitor is connected, you need to instruct Windows how to recognize it. If your computer is running Windows Vista, right-click on the desktop, then click on Display Settings in the “Personalize appearance and sound” window that comes up. If you’re using Windows XP, right-click on the desktop to bring up the Display Properties window, then click on the Settings tab.

In either version of Windows, once you’re in the appropriate screen, you need to tell Windows not only how to see the second monitor, but what to do with it. On the visual representation of your setup, one monitor will be big, highlighted, and labeled “1” (this is your main monitor); your secondary monitor will appear smaller and dark. Right-click the secondary display and select “Attached” to activate it, then click the checkbox that says “Extend my Windows desktop onto this monitor.”

Use the slider to adjust the second screen’s resolution. Under most circumstances, having each monitor set to its native resolution will minimize disorientation when moving or looking from one to another. (Also remember that LCDs tend to look their best at their native resolutions.) Even so, you can experiment with different resolution combinations until you find one to your liking.

Once you’re done, click Apply. Your second monitor is now ready to use.

4. Adjust your monitor layout

This step is optional, but if you don’t want to have to move your cursor onto the second monitor by navigating off the upper-right edge of the screen, as the second monitor’s default position dictates, you’ll need to change its virtual positioning.

From the same window in which you set up the monitor in Step 3, click on your secondary monitor and drag it to its new position. You’ll be informed of the exact pixel location of the second monitor as you move it, and it will “snap” to the nearest edge of the first monitor to automatically create an orderly layout (though you can stagger the two displays messily, if you so desire).

You can “move” the monitor to a position above, below, or to the left or right of your current monitor. For obvious reasons, we recommend having the physical monitor in the same position as its virtual counterpart, but this isn’t essential if another setup works better for you.

5. Add even more monitors?

Today’s cutting-edge graphics cards make it possible for systems using them to run three, four, or even more monitors. If your PC has two or more graphics cards set up in Nvidia’s Scalable Link Interface (SLI) or ATI’s CrossFireX configuration, and each of those cards has multiple video outputs, you could be well on your way to a wall of displays. You’ll need enough video jacks, cables, and electricity to keep them all running, but you would set them up the same way you did your second monitor.

Just make sure to arrange them sensibly so you’ll always know where you’re working—it can be easy to get lost between screens. But once you start working with more than one monitor, you’ll wonder how you did without them for so long.