
ISBN 9781576755648

1 Introduction: The Hole We’re In and How We Can Stop Digging

Slice ePub May 15, 2014

The first rule for being in a hole that you can’t climb out of: Stop digging!

DENIS HEALEY, former British Chancellor of the Exchequer

What if training really had to work? What if your organization were “betting the business” on a new strategic venture whose success depended largely on training? Could you guarantee that the training would absolutely, positively work to drive performance and create business impact? The odds would be against you. The reality is that training fails to work far more often than it works. If you put a hundred employees through the typical corporate training program, chances are that fewer than 20 percent will end up using what they learned in ways that lead to improved job performance. The vast majority of trainees will fail to improve their performance, even if they try to apply the training. They will encounter some combination of obstacles: indifferent bosses, crushing time pressures, a lack of incentives to change, peer pressure, or some other problem that extinguishes their motivation.


ISBN 9780596519407


Source: Ferret
Slice ePub May 28, 2014

A TokenStream takes a field and turns it into a list of tokens. To implement a TokenStream, you need to supply two methods. TokenStream#next should return Tokens in order, followed by nil when there are no more tokens left in the field. TokenStream#text= is used to set the text that the TokenStream will analyze. In Ferret, there are two types of TokenStreams: Tokenizers and TokenFilters.

In the next two sections, we’ll make use of the following test code to test each TokenStream, printing the tokens in a table:
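As a sketch of the contract just described (plain Ruby only; the class and helper names are invented for illustration and are not Ferret’s API), `text=` supplies the input and `next` yields each Token in order and then nil, with a small harness printing the tokens as a table:

```ruby
# Illustrative sketch of the TokenStream contract -- not Ferret's code.
Token = Struct.new(:text, :start_off, :end_off)

class ToyWhiteSpaceTokenizer
  # text= gives the stream the text it will analyze.
  def text=(text)
    @tokens = []
    text.scan(/\S+/) do
      m = Regexp.last_match
      @tokens << Token.new(m[0], m.begin(0), m.end(0))
    end
    @pos = 0
  end

  # next returns each Token in order, then nil when none remain.
  def next
    tok = @tokens[@pos]
    @pos += 1 if tok
    tok
  end
end

# Harness: print every token a stream produces as a table row.
def print_token_table(stream, text)
  stream.text = text
  puts format('%-12s %5s %5s', 'token', 'start', 'end')
  while (tok = stream.next)
    puts format('%-12s %5d %5d', tok.text, tok.start_off, tok.end_off)
  end
end

print_token_table(ToyWhiteSpaceTokenizer.new, "Stop digging now")
```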

Tokenizers take the raw text data from a field and turn it into a list of Tokens. Ferret comes with a number of tokenizer implementations, including:

WhiteSpaceTokenizer (and AsciiWhiteSpaceTokenizer)

LetterTokenizer (and AsciiLetterTokenizer)

StandardTokenizer (and AsciiStandardTokenizer)


Where an ASCII tokenizer exists, the non-ASCII tokenizer is locale-sensitive. That means that the tokenizer will recognize letters, numbers, and whitespace as specified by your locale. If your locale is set to UTF-8, then the tokenizer will recognize UTF-8 characters. This means you need to make sure that the data you are feeding Ferret is in the correct encoding according to your locale; otherwise, you could wind up running into some strange errors. The ASCII tokenizers tend to be more robust and a little faster than the locale-sensitive tokenizers, so if your data is ASCII, you should definitely use an ASCII analyzer.
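The difference between the tokenizer families is easiest to see on a mixed string. The one-liners below are rough plain-Ruby approximations of the whitespace, locale-aware letter, and ASCII letter strategies, not Ferret’s implementations:

```ruby
text = "Résumé #42, e-mail me!"

# Approximation of a whitespace tokenizer: split on runs of whitespace.
whitespace_tokens = text.split(/\s+/)

# Approximation of a Unicode-aware letter tokenizer: runs of letters.
letter_tokens = text.scan(/[[:alpha:]]+/)

# Approximation of an ASCII letter tokenizer: runs of a-z/A-Z only.
ascii_letter_tokens = text.scan(/[A-Za-z]+/)

p whitespace_tokens    # ["Résumé", "#42,", "e-mail", "me!"]
p letter_tokens        # ["Résumé", "e", "mail", "me"]
p ascii_letter_tokens  # ["R", "sum", "e", "mail", "me"]
```

Note how the ASCII approximation shreds the accented word, which is exactly the kind of surprise the paragraph above warns about when the data’s encoding does not match the tokenizer.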



ISBN 9781449367930

4. Functions

Slice ePub January 27, 2015

Functions are the core building blocks of reusable logic. Of course, you probably already knew that, because nearly all other languages also have functions (or methods, the object-oriented variant). Devoting an entire chapter to a concept common across languages may thus seem odd, but in Scala, as in other functional programming languages, functions are of central importance.

Functional programming languages are geared to support the creation of highly reusable and composable functions and to help developers organize their code base around them. Much like a Unix power user will compose multiple single-purpose tools into a complex piped command, a functional programmer will combine single-purpose function invocations into chains of operations (think Map/Reduce). A function that was written with a simple purpose (e.g., to double a number) may be picked up and applied across a 50,000-node list, or given to an actor to be executed locally or in a remote server.
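As a sketch of that style (the doubling function and the pipeline stages are invented for illustration, not taken from the book), a single-purpose function can be applied across a list and chained with further stages, much like a Unix pipe:

```scala
// A single-purpose function: double a number.
val double = (n: Int) => n * 2

// Compose small stages into a pipeline, Unix-pipe style
// (map over a list, filter, then reduce).
val result = (1 to 5).toList
  .map(double)    // List(2, 4, 6, 8, 10)
  .filter(_ > 4)  // List(6, 8, 10)
  .sum            // 24

println(result)
```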

In Scala, functions are named, reusable expressions. They may be parameterized and they may return a value, but neither of these features is required. Both are, however, useful for ensuring maximum reusability and composability, and they will help you write shorter, more readable, and more stable applications. Using parameterized functions, you can normalize duplicated code, simplifying your logic and making it more discoverable. Testing your code also becomes easier, because normalized, parameterized logic is easier to test than denormalized logic repeated throughout your code.
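A minimal illustration of that normalization (the names here are invented, not an example from the book): two duplicated statements collapse into one parameterized function that is trivial to test in isolation.

```scala
// Duplicated logic like
//   println("Hello, Alice!")
//   println("Hello, Bob!")
// normalized into one parameterized, reusable function:
def greeting(name: String): String = s"Hello, $name!"

println(greeting("Alice"))  // Hello, Alice!
println(greeting("Bob"))    // Hello, Bob!
```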


