Dataset Viewer (auto-converted to Parquet)
Schema of the preview:
tweet_id: string (length 10-19)
full_text: string (length 16-359)
expanded_url: string (length 43-52)
embeddings: sequence of floats (length 256)
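The schema above can be checked programmatically against any preview row; a minimal sketch (field names and length bounds are taken from the preview, the sample row is illustrative):

```python
def validate_row(row):
    """Check one preview row against the schema above."""
    assert 10 <= len(row["tweet_id"]) <= 19        # tweet_id: string, length 10-19
    assert 16 <= len(row["full_text"]) <= 359      # full_text: string, length 16-359
    assert 43 <= len(row["expanded_url"]) <= 52    # expanded_url: string, length 43-52
    assert len(row["embeddings"]) == 256           # embeddings: sequence, length 256
    return True

# Illustrative row shaped like the preview entries below
row = {
    "tweet_id": "1865721226620236176",
    "full_text": "Making MLX run on Windows, natively, not WSL or MinGW.",
    "expanded_url": "https://twitter.com/i/web/status/1865721226620236176",
    "embeddings": [0.0] * 256,
}
validate_row(row)
```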
1865721226620236176
Making MLX run on Windows, natively, not WSL or MinGW. https://t.co/qOn9phNf7e
https://twitter.com/i/web/status/1865721226620236176
[ 0.5612192153930664, -1.3742122650146484, -1.7414368391036987, -1.8779406547546387, 0.7156270146369934, 1.4436615705490112, 0.9565522074699402, -0.37011098861694336, 0.5542045831680298, 0.8383215665817261, -1.249984622001648, 0.14407220482826233, -1.1919058561325073, 0.36918655037879944, ...
1865765748209062099
@zcbenz Interesting! It does run, because I usually convert big models on Linux servers. But there is some digging to do for full support. The great thing is that MLX is built using C++, so it's possible.
https://twitter.com/i/web/status/1865765748209062099
[ 0.3261999487876892, 1.2048708200454712, -1.0372496843338013, -2.4511358737945557, 1.3667168617248535, 2.3936307430267334, -0.5775288343429565, -2.0375192165374756, -0.2729433476924896, 0.13118717074394226, -0.4765159785747528, 0.6047537326812744, 0.08176794648170471, 0.33885231614112854, ...
1865482724351287420
If you have a MacBook Pro M series with at least 64 GB of RAM you can now run a GPT-4 level LLM locally! 1. Install @ollama 2. Open your terminal and run ollama pull llama3.3 3. Then ollama run llama3.3 "your prompt" Your own personal AI is here! https://t.co/jakuVlMteE
https://twitter.com/i/web/status/1865482724351287420
[ 0.6010013222694397, -0.7031824588775635, -1.763893723487854, -1.525599718093872, 1.0397119522094727, 1.0617501735687256, 0.33449459075927734, -0.06600143015384674, -0.025183215737342834, 0.19507016241550446, -1.8157137632369995, -0.8436480760574341, -2.0065338611602783, -1.1084907054901123...
1865081419015352689
Gemini-exp-1206, our latest Gemini iteration, (with the full 2M token context and much more) is available right now for free in Google AI Studio and the Gemini API. I hope you have enjoyed year 1 of the Gemini era as much as I have. We are just getting started : )
https://twitter.com/i/web/status/1865081419015352689
[ -0.12989267706871033, 0.4693562686443329, -2.1581103801727295, -0.6392908692359924, 0.9318581223487854, 2.75298810005188, 0.34011030197143555, -0.3327130377292633, 1.3625649213790894, 1.5997809171676636, -1.8230633735656738, -1.373061180114746, -1.2840580940246582, 0.574829638004303, -0....
1865441145788199051
Pro Tip: if you upgrade from ChatGPT Plus to Pro near the end of your billing cycle you only pay a percentage of $200 for the remaining days. So you can try unlimited o1 for much less.
https://twitter.com/i/web/status/1865441145788199051
[ -1.6345093250274658, 3.0760347843170166, -1.6222363710403442, -3.0461575984954834, 1.0703048706054688, 1.5225337743759155, 1.4533432722091675, -0.5846421718597412, 1.2168066501617432, 0.6220940947532654, -1.2438212633132935, -1.1582493782043457, 0.18684372305870056, -1.9802707433700562, ...
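The proration in the tip above is simple arithmetic; a sketch (the 30-day cycle and the day count in the example are illustrative, and actual billing rules may differ):

```python
def prorated_upgrade_cost(full_price, days_remaining, cycle_days=30):
    """Charge only the fraction of the billing cycle that remains (illustrative model)."""
    return full_price * days_remaining / cycle_days

# e.g. upgrading to the $200 Pro tier with 3 days left in a 30-day cycle
cost = prorated_upgrade_cost(200, 3)  # -> 20.0
```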
1865632091427520591
nbsanity now has a bookmarklet https://t.co/VSIrhDXrVS Thanks to @OAustegard It's a static server that renders public Jupyter notebooks with Quarto
https://twitter.com/i/web/status/1865632091427520591
[ 1.6696345806121826, -0.7492926716804504, -3.3341684341430664, -2.843214988708496, 0.6583613753318787, 1.5686211585998535, 1.338955283164978, -0.5099698901176453, -2.402035713195801, -0.1417120099067688, -1.0858708620071411, -0.5225531458854675, -0.7961291670799255, -0.735790491104126, 0....
1865534628691677683
Thanks @_akhaliq! Florence-VL is the first large multimodal model equipped with the popular Florence-v2 vision encoder. The combination is NOT deliberate, but originated from a deep study based on our proposed visual-semantic alignment metric for different vision encoders and… https://t.co/xSaOgDNfCP https://t.co/0McPg...
https://twitter.com/i/web/status/1865534628691677683
[ 1.3361756801605225, 0.80546635389328, -3.058751344680786, -2.517049789428711, 1.0407170057296753, 0.7701636552810669, 0.897321343421936, 1.0834769010543823, -0.548965334892273, 0.7078219652175903, -0.8652453422546387, 0.1226915493607521, -0.7850121259689331, -0.05312849208712578, 0.79897...
1865515272032882723
“we just announced that we are building a 2 gigawatt+ data center in louisiana that we are going to use to train future versions of LLAMA” that’s the same power consumed by 1.6 million homes, roughly as many as there are in georgia https://t.co/TZYXMSe12Q
https://twitter.com/i/web/status/1865515272032882723
[ 0.21270334720611572, 0.8135819435119629, -0.4223920702934265, -0.45993760228157043, 0.8071768283843994, 1.9780195951461792, 0.6146073937416077, -1.0449175834655762, 0.23583240807056427, 1.1943066120147705, 0.2898847460746765, 0.06064888462424278, -1.2217674255371094, -1.6382408142089844, ...
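The homes comparison in the row above is a back-of-the-envelope calculation; a sketch (the ~1.25 kW average household draw is the figure implied by the tweet's own numbers, not an official statistic):

```python
datacenter_watts = 2e9        # "2 gigawatt+" data center
avg_home_watts = 1_250        # assumed average household draw (~1.25 kW)
homes = datacenter_watts / avg_home_watts
print(f"{homes:,.0f} homes")  # 1,600,000 homes
```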
1865522970241708391
https://t.co/gsnXHNxFGA
https://twitter.com/i/web/status/1865522970241708391
[ 5.135742664337158, -2.3667564392089844, -3.507371664047241, -4.431793212890625, 1.0651887655258179, -1.5854073762893677, 1.7353678941726685, -0.3281850814819336, 1.751423716545105, -2.9655418395996094, 0.43927571177482605, 1.7863025665283203, -0.20935556292533875, -0.7352243065834045, 0....
1865522967041433959
Instructed Sonnet to generate a nice readme for my perplexity-search repo. It now looks like a proper project even though it's just making an API call https://t.co/cn6Ms593y7
https://twitter.com/i/web/status/1865522967041433959
[ -0.9418262243270874, 0.9369025230407715, -3.8023552894592285, -1.6584885120391846, 0.39898064732551575, 0.7793837189674377, -0.3017638623714447, -0.8334254026412964, 0.37571975588798523, 0.525952160358429, -2.2959601879119873, -1.10304856300354, -1.2160450220108032, 0.10687042027711868, ...
1865444655342506155
Running @exolabs AI cluster with @Raspberry_Pi 4. exo leverages any spare hardware you have available and puts it to use on AI workloads by splitting them across all devices. You know you've built something magical when you're continually surprised by what people do with it. https://t.co/3Li0Cu6sB7 https://t.co/3xuon...
https://twitter.com/i/web/status/1865444655342506155
[ -0.2046113759279251, 0.013560152612626553, -1.6834030151367188, -2.423550844192505, 1.0437549352645874, 0.8605743646621704, 0.1702089160680771, 0.17842787504196167, -0.2965546250343323, 0.7136004567146301, -0.8219081163406372, -0.7049912810325623, -1.514407753944397, 0.09238770604133606, ...
1865589927540342801
https://t.co/u6pljatcvL
https://twitter.com/i/web/status/1865589927540342801
[ 1.916953206062317, 0.309400349855423, -2.177018404006958, -3.3140716552734375, -1.403745412826538, 1.069481372833252, 0.21788063645362854, 2.750255823135376, -2.210149049758911, 0.313692569732666, -1.3859862089157104, -2.5216517448425293, -3.0636069774627686, -1.3063502311706543, -1.1236...
1865589925170528646
"Take control of your AI agents" https://t.co/ohYy7ymC7l
https://twitter.com/i/web/status/1865589925170528646
[ -0.9897646903991699, -1.0810028314590454, -2.733896017074585, -0.9037990570068359, 0.29021933674812317, -0.38527727127075195, -1.931527018547058, 1.922024130821228, 1.0635361671447754, 0.7301600575447083, -1.3413044214248657, -0.2074512243270874, -2.0266358852386475, -1.1959940195083618, ...
1865305243048718519
Sinterklaas came by a day late. On his miter, 'De Standaard' was embroidered in golden letters. On his staff you could also read 'De Letteren' in curling capitals. The good man pulled the article below out of his pocket. https://t.co/2MUIlqKmoy
https://twitter.com/i/web/status/1865305243048718519
[ 0.6869345307350159, -3.7449984550476074, -4.150768280029297, -3.087956666946411, 1.9076157808303833, 0.05204140767455101, 2.476619243621826, 3.5545766353607178, -1.893348217010498, 0.20842379331588745, 0.6738370060920715, 0.07491686940193176, -0.4045279920101166, 0.81241774559021, 1.1795...
1865419965190328488
currently having my mind blown reading this book in a coffee shop. seriously wtf, wish i found this sooner. https://t.co/ZAfmw820d9
https://twitter.com/i/web/status/1865419965190328488
[ -3.5176889896392822, -1.0426421165466309, -0.9501897692680359, -0.7234265804290771, 0.8320796489715576, 3.4448158740997314, 1.3029401302337646, -0.620251476764679, 1.4819210767745972, 1.9641950130462646, 0.3239336609840393, -0.1714840680360794, -0.774284839630127, 0.46205654740333557, 0....
1865010068204486915
Florence-VL challenges the status quo in Vision-Language Models! Even though Google's PaliGemma 2 dropped yesterday (still using SigLIP), Florence-VL shows us there might be a better way! Here's why this matters 👇 📸 Look at this revealing visualization: Left: Three test images… https://t.co/IkBTsZSNBW
https://twitter.com/i/web/status/1865010068204486915
[ -0.6268230676651001, -0.16394264996051788, -2.2458713054656982, -1.3855488300323486, 0.440878689289093, 1.2784671783447266, 0.5558773279190063, -0.8045994639396667, 0.04955052211880684, 0.07992351800203323, -0.183809295296669, -0.6618932485580444, -0.5566936135292053, -0.5757043957710266, ...
1865399992690544866
@willmcgugan You can hit https://t.co/6rsWJpgkNp, then grab response["info"]["version"]
https://twitter.com/i/web/status/1865399992690544866
[ 0.43823763728141785, 0.9097719788551331, -2.019831657409668, -2.493041515350342, 2.854469060897827, 0.6061225533485413, -0.2034129649400711, -0.09327024966478348, -0.2897675037384033, -1.0327593088150024, -1.6945724487304688, 0.7009143829345703, -1.4448578357696533, 0.37474170327186584, ...
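The `response["info"]["version"]` access in the row above suggests a PyPI-style JSON payload; a minimal sketch using a stub payload (the actual endpoint behind the truncated t.co link is not recoverable from the preview):

```python
import json

def latest_version(payload: dict) -> str:
    """Pull the version string out of a PyPI-style JSON payload."""
    return payload["info"]["version"]

# Stub payload shaped like the tweet's response["info"]["version"] access
payload = json.loads('{"info": {"name": "example", "version": "1.2.3"}}')
print(latest_version(payload))  # 1.2.3
```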
1865188121060712479
Command: mlx_lm.generate --model mlx-community/Llama-3.3-70B-Instruct-4bit --max-tokens 256 --prompt "Do avocado trees grow in the bay area?" Thanks to @Prince_Canuma for converting the models. A bunch of different quants (3, 4, 6, 8, etc) are up on the MLX Community:…
https://twitter.com/i/web/status/1865188121060712479
[ 1.34731924533844, 0.24823150038719177, -1.8499587774276733, -2.5581798553466797, 2.191244125366211, 1.7003554105758667, -0.4737775921821594, -1.249107837677002, -0.38907381892204285, 0.7452213168144226, -0.90984708070755, -0.05473489686846733, -0.6675773859024048, -0.3025948405265808, 0....
1865199571988611382
🐍📺 Using Data Classes in Python [Video] — https://t.co/cVTmP8wjDR #python https://t.co/pm7AaPnLjY
https://twitter.com/i/web/status/1865199571988611382
[ 1.8302737474441528, 0.5730268359184265, -2.9225635528564453, -2.7021970748901367, 0.016618266701698303, 0.8092389702796936, 0.5643011927604675, 0.8925224542617798, -1.6730509996414185, -1.435531735420227, -2.0370800495147705, 0.8970026969909668, -1.6527938842773438, 1.7432680130004883, -...
1688849233967730688
Some research informed thoughts on adding brief sprints to low-intensity (LIT) sessions: 1. Keep the duration down to 3 to 4 seconds! This is the typical "Alactic" duration. Even extending to 6-8 seconds will result in significant lactate production. If you drop in a block of…
https://twitter.com/i/web/status/1688849233967730688
[ -1.973872423171997, 2.130939483642578, -0.21569158136844635, -2.911151170730591, 0.1612633317708969, 1.0122284889221191, -0.08207631856203079, -2.3450734615325928, -0.7517605423927307, 1.0894562005996704, 0.21033510565757751, 0.2984728515148163, -2.512159585952759, -0.05329269915819168, ...
1865170551368519958
omg that slide took me probably a whole evening lol https://t.co/aTuIFrTD7a
https://twitter.com/i/web/status/1865170551368519958
[ -1.129042625427246, -1.7932120561599731, -2.5504777431488037, -1.6982027292251587, -0.1557493954896927, 0.7028782367706299, 1.5976227521896362, 0.3295997977256775, -0.13192510604858398, 1.5060228109359741, -0.6295570135116577, -0.8269705176353455, -1.814416766166687, -0.6586761474609375, ...
1865203069639709079
"Mastering Applied AI, One Concept at a Time" https://t.co/ByXiy0rJzN
https://twitter.com/i/web/status/1865203069639709079
[ -0.451631098985672, -0.30567657947540283, -3.767235040664673, -3.0008809566497803, 0.9573404788970947, -0.4487035572528839, -0.8095642328262329, 0.16475635766983032, -0.622283399105072, -0.0749366283416748, -0.2562319338321686, 0.38589489459991455, -2.4642322063446045, 0.7463423609733582, ...
1865203078040826254
https://t.co/71G609LHhe
https://twitter.com/i/web/status/1865203078040826254
[ -1.570193886756897, -1.022013783454895, -2.4261319637298584, -1.5650354623794556, 0.8692456483840942, 1.771499752998352, 3.304534912109375, 0.5371256470680237, 0.7225611805915833, -0.6320534348487854, -0.6217443943023682, -1.4915125370025635, -0.41789355874061584, -3.0929741859436035, 0....
1865034787582251340
PaliGemma2 for image to JSON data extraction - used google/paligemma2-3b-pt-336 checkpoint; I tried to make it happen with 224, but 336 performed a lot better - trained on A100 with 40GB VRAM - trained with LoRA colab with complete fine-tuning code: https://t.co/M1lbYXQUg6 https://t.co/DHNHGePaqM
https://twitter.com/i/web/status/1865034787582251340
[ 0.34394383430480957, 0.7778353095054626, -2.895763635635376, -2.5097544193267822, 1.137946367263794, 1.3457980155944824, 0.8381218910217285, 0.3442422151565552, 0.8346719741821289, -0.6113961338996887, -0.7050164341926575, -0.510120689868927, -1.2011185884475708, -1.0977107286453247, 0.2...
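The row above mentions training with LoRA; the core idea is a low-rank update added to a frozen weight, x @ (W + A @ B). A minimal numpy sketch (the shapes and rank here are illustrative and unrelated to the actual PaliGemma2 fine-tuning code):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank = 16, 8, 2              # illustrative sizes; real models are far larger

W = rng.normal(size=(d_in, d_out))        # frozen pretrained weight
A = rng.normal(size=(d_in, rank)) * 0.01  # trainable low-rank factor
B = np.zeros((rank, d_out))               # B starts at zero so the adapter is a no-op initially

def lora_forward(x):
    """Frozen weight plus low-rank adapter: x @ (W + A @ B)."""
    return x @ W + (x @ A) @ B

x = rng.normal(size=(4, d_in))
# With B = 0, the adapter contributes nothing yet
assert np.allclose(lora_forward(x), x @ W)
```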
1865187697146700273
Llama 3.3 70B 4-bit runs nicely on a 64GB M3 Max in MLX LM (~10 toks/sec). Would be even faster on an M4 Max. Yesterday's server-only 405B is today's laptop 70B: https://t.co/ssxITH5ggT https://t.co/vgEn2E5WS8
https://twitter.com/i/web/status/1865187697146700273
[ 0.9599223732948303, 0.242132768034935, -1.5308445692062378, -2.3684821128845215, 2.433318853378296, 2.118936538696289, 1.3987523317337036, -1.2276803255081177, 1.3235549926757812, -0.07478565722703934, -1.1558858156204224, 0.1721528172492981, -1.4624801874160767, -0.9021667838096619, 1.0...
1865198113188643197
https://t.co/zGpeO2mJLi
https://twitter.com/i/web/status/1865198113188643197
[ 3.0623300075531006, -0.7782238125801086, -4.676839351654053, -1.9230809211730957, 0.6540589928627014, 0.11029358208179474, 1.5960545539855957, 0.923703134059906, 0.5802040696144104, 0.2868264317512512, 0.29931458830833435, -0.7229024171829224, -2.429213762283325, 0.2884608507156372, 1.09...
1865162836948791299
Quick community showcase! @aditshah00 added an example of running marimo inside @modal_labs, bringing serverless cloud computing power to interactive notebooks! 💪 https://t.co/SDUycwoTOE
https://twitter.com/i/web/status/1865162836948791299
[ 1.9286004304885864, 0.43670013546943665, -2.8277153968811035, -3.910433769226074, 0.7495779991149902, 2.1466991901397705, 1.0774458646774292, 0.1068810373544693, -0.8883339166641235, 2.1014482975006104, -1.1670249700546265, 0.2716890573501587, -0.16907916963100433, -0.21337918937206268, ...
1865184209754525943
One of the best use cases for geospatial data is examining environmental factors. Here's a list of my favourite geospatial environmental datasets: https://t.co/HhN7GuqHG9
https://twitter.com/i/web/status/1865184209754525943
[ 4.145713806152344, 0.8961296677589417, -2.2549750804901123, -2.5679943561553955, -0.8873301148414612, -0.15280723571777344, 3.080845832824707, -1.1227411031723022, 0.49494320154190063, 0.47064095735549927, 1.5917214155197144, 0.8096921443939209, -1.2325468063354492, -1.7758088111877441, ...
1865203415342612732
moka-py: A high-performance caching library for Python, written in Rust. https://t.co/eGICfwfksO
https://twitter.com/i/web/status/1865203415342612732
[ 2.156484365463257, -0.2064889818429947, -1.7748850584030151, -2.762277603149414, 0.9191105961799622, 2.2378172874450684, -0.4639986753463745, -1.1286370754241943, -0.5283827185630798, 0.6821969151496887, -0.2640315592288971, 1.4122806787490845, 0.18080246448516846, 1.5311377048492432, 0....
1865073523896516950
https://t.co/fomVkelxbx
https://twitter.com/i/web/status/1865073523896516950
[ 0.5382225513458252, -0.8595911860466003, -3.2280924320220947, -4.150956630706787, -0.1429397612810135, 0.2698945105075836, 1.5819551944732666, 2.1130638122558594, 0.9035160541534424, -0.2464761734008789, -1.0476776361465454, 0.7037813067436218, -1.0358021259307861, -0.22533230483531952, ...
1865073353196794156
I have got to praise Alibaba for mPLUG DocOwl. What an amazing state of the art tool, and they have open sourced their WHOLE pipeline. Code, Dataset, Weights, Everything. Bravo! Links in thread. https://t.co/zidwL2ZlGx
https://twitter.com/i/web/status/1865073353196794156
[ 0.44533610343933105, -0.4145002067089081, -0.9867087602615356, -2.637251138687134, 1.4870021343231201, 2.686047077178955, 0.3960570693016052, 0.5683489441871643, -1.250717043876648, -0.09192276000976562, -1.3320353031158447, -0.01903865486383438, -0.10840732604265213, -0.8812541961669922, ...
1865184608070824116
Data flywheels are the secret sauce for AI-driven products. Use your users' interactions to continuously improve. The more users, the better the product, the more users. Netflix and Spotify nailed it. You should too. https://t.co/wClR6kBYCW
https://twitter.com/i/web/status/1865184608070824116
[ -2.240278482437134, 0.5952473282814026, -1.1407394409179688, -2.8183372020721436, 0.7800147533416748, 1.5294750928878784, -1.0255076885223389, -0.0164325013756752, 0.7742634415626526, 1.2237919569015503, -0.010188041254878044, -1.2985681295394897, -1.372240662574768, 0.48738428950309753, ...
1865096566467686909
today we are announcing reinforcement finetuning, which makes it really easy to create expert models in specific domains with very little training data. livestream going now: https://t.co/ABHFV8NiKc alpha program starting now, launching publicly in q1
https://twitter.com/i/web/status/1865096566467686909
[ 0.06157713755965233, 2.413179397583008, -1.3406474590301514, -3.0811715126037598, 0.8652238249778748, 1.6039941310882568, -0.5466339588165283, 1.1087727546691895, 0.08753564208745956, 0.24537315964698792, -1.8313907384872437, 0.05801395699381828, -0.5019732117652893, 0.8649920225143433, ...
1865123445220028910
Coming to MLX 🚀 https://t.co/YQhS0IDqtr https://t.co/1ri9OenvdZ
https://twitter.com/i/web/status/1865123445220028910
[ 1.1849926710128784, -1.2456402778625488, -3.257899045944214, -2.4553170204162598, 0.041203975677490234, 0.3712877035140991, 2.2239174842834473, 1.8722413778305054, 0.09239891916513443, -0.4299595355987549, -0.3285350203514099, 0.3325439393520355, -2.206737518310547, 0.30403247475624084, ...
1865088634342523169
"a new 70B model that delivers the performance of our 405B model" is exciting because I might just be able to run a quantized version of the 70B on my 64GB Mac - looking forward to some GGUFs of this https://t.co/6X9wDesnEG
https://twitter.com/i/web/status/1865088634342523169
[ 0.16447895765304565, 0.6904184818267822, -0.7005704045295715, -2.887962579727173, 2.117222785949707, 2.8938915729522705, -0.21498848497867584, -1.1959648132324219, 1.2715234756469727, -0.30672013759613037, -0.12930341064929962, -0.6708284616470337, -1.0504026412963867, -0.39967289566993713...
1865070330286403846
Belgians are financially happy when they earn more than 5,500 euros net per month https://t.co/bcg0AdyMs3
https://twitter.com/i/web/status/1865070330286403846
[ 2.876303195953369, -2.0476644039154053, -2.529730796813965, -3.917426586151123, 2.944241523742676, 1.445269227027893, 0.44565364718437195, 1.330523133277893, -0.7241898775100708, 0.7630322575569153, 1.0895658731460571, 0.8396067023277283, 0.149701327085495, 0.2999117374420166, 1.31360220...
End of preview.
README.md exists but content is empty.
Downloads last month: 9
Space using cast42/x_likes_embeddings_potion_base_8M: 1