Feature hashing

In machine learning, feature hashing, also known as the hashing trick (by analogy to the kernel trick), is a fast and space-efficient way of vectorizing features, i.e. turning arbitrary features into indices in a vector or matrix.[1][2] It works by applying a hash function to the features and using their hash values as indices directly (after a modulo operation), rather than looking the indices up in an associative array. In addition to its use for encoding non-numeric values, feature hashing can also be used for dimensionality reduction.[2]

This trick is often attributed to Weinberger et al. (2009),[2] but there exists a much earlier description of this method published by John Moody in 1989.[1]

Motivation

Motivating example

In a typical document classification task, the input to the machine learning algorithm (both during learning and classification) is free text. From this, a bag of words (BOW) representation is constructed: the individual tokens are extracted and counted, and each distinct token in the training set defines a feature (independent variable) of each of the documents in both the training and test sets.

Machine learning algorithms, however, are typically defined in terms of numerical vectors. Therefore, the bags of words for a set of documents are regarded as a term-document matrix where each row is a single document, and each column is a single feature/word; the entry i, j in such a matrix captures the frequency (or weight) of the j'th term of the vocabulary in document i. (An alternative convention swaps the rows and columns of the matrix, but this difference is immaterial.) Typically, these vectors are extremely sparse, in accordance with Zipf's law.

The common approach is to construct, at learning time or prior to that, a dictionary representation of the vocabulary of the training set, and use that to map words to indices. Hash tables and tries are common candidates for dictionary implementation. E.g., the three documents

  • John likes to watch movies.
  • Mary likes movies too.
  • John also likes football.

can be converted, using the dictionary

Term Index
John 1
likes 2
to 3
watch 4
movies 5
Mary 6
too 7
also 8
football 9

to the term-document matrix

$$\begin{pmatrix} 1 & 1 & 1 & 1 & 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 1 & 1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 & 0 & 0 & 0 & 1 & 1 \end{pmatrix}$$

(Punctuation was removed, as is usual in document classification and clustering.)
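For illustration, dictionary-based vectorization can be sketched in a few lines of Python (a minimal sketch; the names and structure are illustrative, not taken from any particular library):

 # Build a vocabulary from the training documents, then map each
 # document to a term-count vector (one column per known term).
 docs = [
     "John likes to watch movies",
     "Mary likes movies too",
     "John also likes football",
 ]

 vocab = {}                      # term -> column index, grown as terms appear
 for doc in docs:
     for token in doc.split():
         vocab.setdefault(token, len(vocab))

 def vectorize(doc):
     x = [0] * len(vocab)
     for token in doc.split():
         if token in vocab:      # unseen tokens are silently dropped
             x[vocab[token]] += 1
     return x

 matrix = [vectorize(doc) for doc in docs]
 # matrix[0] == [1, 1, 1, 1, 1, 0, 0, 0, 0]   (John, likes, to, watch, movies, ...)

Note that the dictionary `vocab` must be stored alongside the model, and grows with the training set; the hashing trick removes exactly this requirement.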

The problem with this process is that such dictionaries take up a large amount of storage space and grow in size as the training set grows.[3] Conversely, if the vocabulary is kept fixed and not increased with a growing training set, an adversary may try to invent new words or misspellings that are not in the stored vocabulary so as to circumvent a machine learned filter. To address this challenge, Yahoo! Research attempted to use feature hashing for their spam filters.[4]

Note that the hashing trick isn't limited to text classification and similar tasks at the document level, but can be applied to any problem that involves large (perhaps unbounded) numbers of features.

Mathematical motivation

Mathematically, a token is an element in a finite (or countably infinite) set $T$. If we only need to process a finite corpus, then we can put all tokens appearing in the corpus into $T$, meaning that $T$ is finite. However, if we want to process all possible words made of the English letters, then $T$ is countably infinite.

Most neural networks can only operate on real vector inputs, so we must construct a "dictionary" function $\phi: T \to \mathbb{R}^n$.

When $T$ is finite, of size $|T| = m$, we can use one-hot encoding to map it into $\mathbb{R}^m$. First, arbitrarily enumerate $T = \{t_1, t_2, \dots, t_m\}$, then define $\phi(t_i) = e_i$. In other words, we assign a unique index $i$ to each token, then map the token with index $i$ to the unit basis vector $e_i$.

One-hot encoding is easy to interpret, but it requires one to maintain the arbitrary enumeration of $T$. Given a token $t \in T$, to compute $\phi(t)$, we must find out the index $i$ of the token $t$. Thus, to implement $\phi$ efficiently, we need a fast-to-compute bijection $h: T \to \{1, \dots, m\}$, then we have $\phi(t) = e_{h(t)}$.

In fact, we can relax the requirement slightly: it suffices to have a fast-to-compute injection $h: T \to \{1, \dots, n\}$, then use $\phi(t) = e_{h(t)}$.

In practice, there is no simple way to construct an efficient injection $h$. However, we do not need a strict injection, only an approximate one. That is, when $t \neq t'$, we should probably have $h(t) \neq h(t')$, so that probably $\phi(t) \neq \phi(t')$.

At this point, we have just specified that $h$ should be a hashing function. Thus we reach the idea of feature hashing.
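As a concrete illustration, the following minimal Python sketch replaces the stored enumeration with a hash function; the use of MD5 as the approximate injection $h$ is an assumption made here for reproducibility, not a requirement:

 import hashlib

 N = 2**20                      # dimension of the target vector space R^N

 def h(token: str) -> int:
     """Approximate injection T -> {0, ..., N-1} via a standard hash."""
     digest = hashlib.md5(token.encode("utf-8")).digest()
     return int.from_bytes(digest[:8], "little") % N

 # phi(t) is the unit basis vector e_{h(t)}; returning the index suffices,
 # since materializing a length-N one-hot vector is rarely necessary.
 print(h("cat"), h("dog"))      # distinct with high probability, no dictionary kept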

Algorithms

Feature hashing (Weinberger et al. 2009)

The basic feature hashing algorithm presented in (Weinberger et al. 2009)[2] is defined as follows.

First, one specifies two hash functions: the kernel hash $h: T \to \{1, \dots, n\}$, and the sign hash $\xi: T \to \{-1, +1\}$. Next, one defines the feature hashing function

$$\phi: T \to \mathbb{R}^n, \quad \phi(t) = \xi(t) \, e_{h(t)}.$$

Finally, extend this feature hashing function to strings of tokens by

$$\phi: T^* \to \mathbb{R}^n, \quad \phi(t_1, \dots, t_k) = \sum_{j=1}^k \phi(t_j),$$

where $T^*$ is the set of all finite strings consisting of tokens in $T$.

Equivalently,

$$\phi(t_1, \dots, t_k) = \sum_{j=1}^k \xi(t_j) \, e_{h(t_j)} = \sum_{i=1}^n \Bigl( \sum_{j \,:\, h(t_j) = i} \xi(t_j) \Bigr) e_i.$$
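A minimal NumPy sketch of this definition follows; the particular hash constructions and the names are illustrative assumptions, not those of Weinberger et al.[2]:

 import hashlib
 import numpy as np

 n = 8                                    # output dimension (small, for display)

 def _digest(token: str, salt: bytes) -> int:
     return int.from_bytes(hashlib.md5(salt + token.encode()).digest()[:8], "little")

 def h(token: str) -> int:                # kernel hash: T -> {0, ..., n-1}
     return _digest(token, b"kernel") % n

 def xi(token: str) -> int:               # sign hash: T -> {-1, +1}
     return 1 if _digest(token, b"sign") % 2 == 0 else -1

 def phi(tokens):                         # extension to strings of tokens
     x = np.zeros(n)
     for t in tokens:
         x[h(t)] += xi(t)                 # phi(t) = xi(t) * e_{h(t)}
     return x

 print(phi("John likes to watch movies".split()))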

Geometric properties

We want to say something about the geometric property of $\phi$, but $T$, by itself, is just a set of tokens; we cannot impose a geometric structure on it except the discrete topology, which is generated by the discrete metric. To make it nicer, we lift it to $\mathbb{R}^T$, and lift $\phi$ from $T$ to $\mathbb{R}^T$ by linear extension:

$$\phi((x_t)_{t \in T}) = \sum_{t \in T} x_t \, \xi(t) \, e_{h(t)}.$$

There is an infinite sum there, which must be handled at once. There are essentially only two ways to handle infinities. One may impose a metric, then take its completion, to allow well-behaved infinite sums, or one may demand that nothing is actually infinite, only potentially so. Here, we go for the potential-infinity way, by restricting $\mathbb{R}^T$ to contain only vectors with finite support: for each $x \in \mathbb{R}^T$, only finitely many entries of $x$ are nonzero.

Define an inner product on $\mathbb{R}^T$ in the obvious way:

$$\langle x, x' \rangle = \sum_{t \in T} x_t x'_t.$$

As a side note, if $T$ is infinite, then the inner product space $\mathbb{R}^T$ is not complete. Taking its completion would get us to a Hilbert space, which allows well-behaved infinite sums.


Now we have an inner product space, with enough structure to describe the geometry of the feature hashing function $\phi$.

First, we can see why $h$ is called a "kernel hash": it allows us to define a kernel $K$ by

$$K(t, t') = \langle e_{h(t)}, e_{h(t')} \rangle.$$

In the language of the "kernel trick", $K$ is the kernel generated by the "feature map"

$$\varphi: T \to \mathbb{R}^n, \quad \varphi(t) = e_{h(t)}.$$

Note that this is not the feature map we were using, which is $\phi(t) = \xi(t) \, e_{h(t)}$. In fact, we have been using another kernel $K_\xi$, defined by

$$K_\xi(t, t') = \langle \xi(t) \, e_{h(t)}, \, \xi(t') \, e_{h(t')} \rangle.$$

The benefit of augmenting the kernel hash $h$ with the binary hash $\xi$ is the following theorem, which states that $\phi$ is an isometry "on average".

Theorem (intuitively stated) — If the binary hash $\xi$ is unbiased (meaning that it takes values $-1$ and $+1$ with equal probability), then $\phi$ is an isometry in expectation:

$$E[\langle \phi(x), \phi(x') \rangle] = \langle x, x' \rangle.$$

Proof

By linearity of expectation,

$$E[\langle \phi(x), \phi(x') \rangle] = \sum_{t, t' \in T} x_t x'_{t'} \, E[\xi(t)\xi(t')] \, \langle e_{h(t)}, e_{h(t')} \rangle.$$

Now, $E[\xi(t)\xi(t')] = 1$ if $t = t'$ and $0$ otherwise, since we assumed $\xi$ is unbiased with entries independent across tokens. So we continue:

$$\sum_{t, t' \in T} x_t x'_{t'} \, E[\xi(t)\xi(t')] \, \langle e_{h(t)}, e_{h(t')} \rangle = \sum_{t \in T} x_t x'_t \, \langle e_{h(t)}, e_{h(t)} \rangle = \sum_{t \in T} x_t x'_t = \langle x, x' \rangle.$$


The above statement and proof interpret the binary hash function $\xi$ not as a deterministic function of type $T \to \{-1, +1\}$, but as a random binary vector with unbiased entries, meaning that $\Pr(\xi(t) = +1) = \Pr(\xi(t) = -1) = 1/2$ for any $t \in T$.

This is a good intuitive picture, though not rigorous. For a rigorous statement and proof, see Weinberger et al. (2009).[2]
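The expectation can also be checked numerically. The following Python sketch (illustrative; it draws a fresh unbiased sign hash per trial while keeping the kernel hash fixed) estimates $E[\langle \phi(x), \phi(x') \rangle]$ for two small vectors:

 import numpy as np

 rng = np.random.default_rng(0)
 n = 4
 tokens = ["cat", "dog", "fish", "bird", "mouse"]
 x  = np.array([1.0, 2.0, 0.0, 1.0, 0.0])    # vectors in R^T with finite support
 xp = np.array([2.0, 1.0, 1.0, 0.0, 0.0])
 h = {t: rng.integers(n) for t in tokens}    # one fixed kernel hash

 def phi(v, xi):
     out = np.zeros(n)
     for i, t in enumerate(tokens):
         out[h[t]] += v[i] * xi[t]           # linear extension of phi
     return out

 total, trials = 0.0, 100_000
 for _ in range(trials):
     xi = {t: rng.choice([-1.0, 1.0]) for t in tokens}  # fresh unbiased sign hash
     total += phi(x, xi) @ phi(xp, xi)
 print(total / trials, x @ xp)               # estimate approaches <x, x'> = 4.0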

Pseudocode implementation

Instead of maintaining a dictionary, a feature vectorizer that uses the hashing trick can build a vector of a pre-defined length by applying a hash function h to the features (e.g., words), then using the hash values directly as feature indices and updating the resulting vector at those indices. Here, we assume that feature actually means feature vector.

 function hashing_vectorizer(features : array of string, N : integer):
     x := new vector[N]
     for f in features:
         h := hash(f)
         x[h mod N] += 1
     return x

Thus, if our feature vector is ["cat","dog","cat"], the hash function is $h(f) = 1$ if $f$ is "cat" and $h(f) = 2$ if $f$ is "dog", and the output feature vector dimension is $N = 4$, then the output x will be [0,2,1,0]. It has been suggested that a second, single-bit output hash function ξ be used to determine the sign of the update value, to counter the effect of hash collisions.[2] If such a hash function is used, the algorithm becomes

 function hashing_vectorizer(features : array of string, N : integer):
     x := new vector[N]
     for f in features:
         h := hash(f)
         idx := h mod N
         if ξ(f) == 1:
             x[idx] += 1
         else:
             x[idx] -= 1
     return x

The above pseudocode actually converts each sample into a vector. An optimized version would instead only generate a stream of $(h(f), \xi(f))$ pairs and let the learning and prediction algorithms consume such streams; a linear model can then be implemented as a single hash table representing the coefficient vector.
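A runnable Python version of the signed vectorizer above might look as follows (a sketch; deriving both hash and ξ from salted MD5 digests is an assumption made here for reproducibility, not prescribed by the pseudocode):

 import hashlib

 def _digest(s: str, salt: bytes) -> int:
     return int.from_bytes(hashlib.md5(salt + s.encode()).digest()[:8], "little")

 def hashing_vectorizer(features, N):
     """Signed feature hashing: returns a length-N list of signed counts."""
     x = [0] * N
     for f in features:
         idx = _digest(f, b"kernel") % N                # h(f) mod N
         sign = 1 if _digest(f, b"sign") % 2 == 0 else -1   # xi(f) in {-1, +1}
         x[idx] += sign
     return x

 print(hashing_vectorizer(["cat", "dog", "cat"], 4))   # a length-4 signed count vector

The streaming variant mentioned above would instead yield (index, sign) pairs from inside the loop and let the learner update a hash table of coefficients directly.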

Extensions and variations

Learned feature hashing

Feature hashing generally suffers from hash collision, which means that there exist pairs of different tokens with the same hash: $t \neq t'$ but $h(t) = h(t')$. A machine learning model trained on feature-hashed words would then have difficulty distinguishing $t$ and $t'$, essentially because the symbol $h(t)$ is polysemic.

If $t'$ is rare, then performance degradation is small, as the model could always just ignore the rare case, and pretend all occurrences of $h(t)$ mean $t$. However, if both are common, then the degradation can be serious.

To handle this, one can train supervised hashing functions that avoid mapping common tokens to the same feature vectors.[5]

Applications and practical performance

Ganchev and Dredze showed that in text classification applications with random hash functions and several tens of thousands of columns in the output vectors, feature hashing need not have an adverse effect on classification performance, even without the signed hash function.[3]

Weinberger et al. (2009) applied their version of feature hashing to multi-task learning, and in particular, spam filtering, where the input features are pairs (user, feature) so that a single parameter vector captured per-user spam filters as well as a global filter for several hundred thousand users, and found that the accuracy of the filter went up.[2]
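The per-user construction can be sketched as follows (a hypothetical illustration of the pair scheme, not the authors' code): each token is hashed once on its own for the global filter and once prefixed with the user ID for the personalized filter, so both filters share one parameter vector.

 import hashlib

 def _h(s: str, N: int) -> int:
     return int.from_bytes(hashlib.md5(s.encode()).digest()[:8], "little") % N

 def personalized_indices(user: str, tokens, N=2**22):
     """Yield hashed indices for the global filter and the per-user filter."""
     for t in tokens:
         yield _h(t, N)                   # global feature
         yield _h(user + "_" + t, N)      # (user, feature) pair

 print(list(personalized_indices("alice", ["cheap", "pills"], N=16)))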

Chen et al. (2015) combined the idea of feature hashing and sparse matrices to construct "virtual matrices": large matrices with small storage requirements. The idea is to treat a matrix as a dictionary, with keys in $[n] \times [m]$, and values in $\mathbb{R}$. Then, as usual with hashed dictionaries, one can use a hash function $h: [n] \times [m] \to [N]$, and thus represent a matrix as a vector in $\mathbb{R}^N$, no matter how big $n \times m$ is. With virtual matrices, they constructed HashedNets, which are large neural networks taking only small amounts of storage.[6]
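A minimal sketch of the virtual-matrix idea (the names and the hash choice are illustrative assumptions, not the construction used in [6]):

 import hashlib
 import numpy as np

 N = 1024                                  # true number of stored parameters

 def h(i: int, j: int) -> int:             # hash of the key (i, j) into [N]
     digest = hashlib.md5(f"{i},{j}".encode()).digest()
     return int.from_bytes(digest[:8], "little") % N

 w = np.random.default_rng(0).standard_normal(N)   # the real parameter vector

 def virtual_entry(i: int, j: int) -> float:
     """Entry (i, j) of a virtual n-by-m matrix; n and m never constrain storage."""
     return w[h(i, j)]

 # A "1,000,000 x 1,000,000" matrix backed by only N floats:
 print(virtual_entry(123_456, 789_012))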

Implementations

Implementations of the hashing trick are present in:

  • Apache Mahout[7]
  • Gensim[8]
  • scikit-learn[9]
  • sofia-ml[10]
  • Apache Spark[11]
  • R (FeatureHashing package)[12]
  • TensorFlow[13]
  • Dask-ML[14]

References

  1. ^ a b Moody, John (1989). "Fast learning in multi-resolution hierarchies" (PDF). Advances in Neural Information Processing Systems.
  2. ^ a b c d e f g Kilian Weinberger; Anirban Dasgupta; John Langford; Alex Smola; Josh Attenberg (2009). Feature Hashing for Large Scale Multitask Learning (PDF). Proc. ICML.
  3. ^ a b K. Ganchev; M. Dredze (2008). Small statistical models by random feature mixing (PDF). Proc. ACL08 HLT Workshop on Mobile Language Processing.
  4. ^ Josh Attenberg; Kilian Weinberger; Alex Smola; Anirban Dasgupta; Martin Zinkevich (2009). "Collaborative spam filtering with the hashing trick". Virus Bulletin.
  5. ^ Bai, B.; Weston J.; Grangier D.; Collobert R.; Sadamasa K.; Qi Y.; Chapelle O.; Weinberger K. (2009). Supervised semantic indexing (PDF). CIKM. pp. 187–196.
  6. ^ Chen, Wenlin; Wilson, James; Tyree, Stephen; Weinberger, Kilian; Chen, Yixin (2015-06-01). "Compressing Neural Networks with the Hashing Trick". International Conference on Machine Learning. PMLR: 2285–2294. arXiv:1504.04788.
  7. ^ Owen, Sean; Anil, Robin; Dunning, Ted; Friedman, Ellen (2012). Mahout in Action. Manning. pp. 261–265.
  8. ^ "gensim: corpora.hashdictionary – Construct word<->id mappings". Radimrehurek.com. Retrieved 2014-02-13.
  9. ^ "4.1. Feature extraction — scikit-learn 0.14 documentation". Scikit-learn.org. Retrieved 2014-02-13.
  10. ^ "sofia-ml - Suite of Fast Incremental Algorithms for Machine Learning. Includes methods for learning classification and ranking models, using Pegasos SVM, SGD-SVM, ROMMA, Passive-Aggressive Perceptron, Perceptron with Margins, and Logistic Regression". Retrieved 2014-02-13.
  11. ^ "Hashing TF". Retrieved 4 September 2015. Maps a sequence of terms to their term frequencies using the hashing trick.
  12. ^ "FeatureHashing: Creates a Model Matrix via Feature Hashing With a Formula Interface". 10 January 2024.
  13. ^ "tf.keras.preprocessing.text.hashing_trick — TensorFlow Core v2.0.1". Retrieved 2020-04-29. Converts a text to a sequence of indexes in a fixed-size hashing space.
  14. ^ "dask_ml.feature_extraction.text.HashingVectorizer — dask-ml 2021.11.17 documentation". ml.dask.org. Retrieved 2021-11-22.
