Hacker News

zlwaterfield
Scramble: Open-Source Alternative to Grammarly github.com

Technetium4 months ago

They proclaim "privacy-respecting" but all your keystrokes go to OpenAI. Horrific and genuinely upsetting.

Edit: The author replied to another comment that there is an intent to add local AI. If that is the plan, then fix the wording until it can actually be considered privacy-respecting: https://news.ycombinator.com/item?id=41579144

charlie04 months ago

Lol, this was my second thought immediately after my first, which was one of excitement. Hope the author does add an option for local. Wonder how that would work as a Chrome extension. Doesn't seem like a good idea for extensions to be accessing local resources though.

mdaniel4 months ago

> Doesn't seem like a good idea for extensions to be accessing local resources though.

To the best of my knowledge all localhost connections are exempt from CORS and that's in fact how the 1Password extension communicates with the desktop app. I'd bet Bitwarden and KeePassXC behave similarly

fph4 months ago

You can self-host LanguageTool and use its Chrome/Firefox extension. The extension talks to a LanguageTool server over HTTP and takes the server address as a configurable option, so you just run the local server and pass localhost:8080 as the address.
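
For anyone who hasn't used it, the server side of that is a plain HTTP API. A minimal sketch of the check request the extension ends up making, assuming your local server is listening on the port mentioned above (adjust to whatever you configured):

```javascript
// Sketch: a grammar check against a self-hosted LanguageTool server on localhost:8080.
const params = new URLSearchParams({
  text: "This are a example sentence.",
  language: "en-US",
});

const response = await fetch("http://localhost:8080/v2/check", {
  method: "POST",
  body: params, // sent as application/x-www-form-urlencoded
});

const { matches } = await response.json();
for (const m of matches) {
  // Each match carries an offset/length into the text plus suggested replacements.
  console.log(m.message, m.replacements.map((r) => r.value));
}
```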

Eisenstein4 months ago

Download koboldcpp and llama3.1 gguf weights, use it with the llama3 completions adapter.

Edit the 'background.js' file in the extension and replace the OpenAI endpoint with

'http://your.local.ip.addr:5001/v1/chat/completions'

Set anything you want as an API key. Now you have a truly local version.

* https://github.com/LostRuins/koboldcpp/releases

* https://huggingface.co/bartowski/Meta-Llama-3.1-8B-Instruct-...

* https://github.com/LostRuins/koboldcpp/blob/concedo/kcpp_ada...
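
To make that concrete, here is a rough sketch of what the edited request in background.js could end up looking like, assuming the extension sends a standard OpenAI-style chat-completions payload (the prompt text and variable names below are placeholders, not the extension's actual code):

```javascript
// Sketch: pointing an OpenAI-style chat completion at a local koboldcpp server.
// koboldcpp exposes an OpenAI-compatible endpoint; the IP/port below is a placeholder.
const LOCAL_ENDPOINT = "http://your.local.ip.addr:5001/v1/chat/completions";

const response = await fetch(LOCAL_ENDPOINT, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer anything", // the local server doesn't check the key
  },
  body: JSON.stringify({
    model: "local", // the server answers with whatever model it has loaded
    messages: [
      { role: "system", content: "Fix spelling and grammar. Keep the author's voice." },
      { role: "user", content: "Text highlighted in the browser goes here." },
    ],
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content);
```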

sharemywin4 months ago

settings for opting out of training etc. for OpenAI

https://help.openai.com/en/articles/7730893-data-controls-fa...

lawlessone4 months ago

I'd be surprised if that's honored.

segmondy4 months ago

much ado about nothing, the code is there, edit it and use a local AI.

contagiousflow4 months ago

But the code as given is said to respect privacy.

Alex43864 months ago

People really should stop calling a glorified OpenAI API wrapper open-source software.

jillesvangurp4 months ago

There are several free alternatives to OpenAI that expose the same API, which would make it possible to swap OpenAI out for one of those models in this extension. At least on paper. There is an open issue on the GitHub repository requesting something like that.

So, it's not as clear cut. The general approach of using LLMs for this is not a bad one; LLMs are pretty good at this stuff.
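
As a sketch of how little would need to change on paper, supporting those alternatives is mostly a matter of making the base URL, key, and model name configurable. The provider table below is purely illustrative, not something the extension ships today:

```javascript
// Illustrative provider table for OpenAI-compatible chat-completions servers.
const PROVIDERS = {
  openai: { baseUrl: "https://api.openai.com/v1", apiKey: "sk-...", model: "gpt-4-turbo" },
  ollama: { baseUrl: "http://localhost:11434/v1", apiKey: "unused", model: "llama3.1" },
};

async function complete(providerName, messages) {
  const { baseUrl, apiKey, model } = PROVIDERS[providerName];
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
    body: JSON.stringify({ model, messages }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```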

dotancohen4 months ago

Yes, but the API at the end is providing the core functionality. Simply swapping out one LLM for another, let alone one from a different company altogether, will completely change the effectiveness and usefulness of the application.

Tepix4 months ago

Well, as we see with AI applications like "Leo AI" and "Continue", a locally run LLM can be a fantastic replacement for proprietary offerings.

dartos4 months ago

FWIW I’ve found local models to be essentially useless for coding tasks.

Tepix4 months ago

Really? Maybe your models are too small?

spmurrayzzz4 months ago

The premier open-weight models don't perform comparatively well on the public benchmarks relative to frontier models. And that's assuming at least some degree of benchmark contamination inflating the open-weight results.

While I don't think they're completely useless (though it's close), calling them fantastic replacements feels like an egregious overstatement of their value.

EDIT: Also wanted to note that I think this becomes as much an expectations-setting exercise as it is evaluation on raw programming performance. Some people are incredibly impressed by the ability to assist in building simple web apps, others not so much. Experience will vary across that continuum.

dartos4 months ago

Yeah, in comparing DeepSeek Coder V2 Lite (the best coding model I can find that'll run on my 4090) to Claude Sonnet under aider…

DeepSeek Lite was essentially useless: too slow, and the edits were too low quality.

I’ve been programming for about 17 years, so the things I want aider to do are a little more specific than building simple web apps. Larger models are just better at it.

I can run the full deepseek coder model on some cloud and probably get very acceptable results, but then it’s no longer local.

websap4 months ago

Woah woah! Those are fighting words. /s

dartos4 months ago

One would hope that, since the problem these models are trying to solve is language modeling, they would eventually converge on similar capabilities.

JCharante4 months ago

everyone stands on the shoulders of giants.

sham14 months ago

Things standing on the shoulders of proprietary giants shouldn't claim to be free software/open source.

t-writescode4 months ago

Their interfacing software __is__ open source, and they're asking for your OpenAI API key to operate. If I were going to hand over an API key like that, I would expect and want open-source code, so I could be sure my key was only being used for my own work, that I'm only paying for my own usage, and that the key isn't being stolen in some way.

noduerme4 months ago

My older brother who got me into coding learned to code in Assembly. He doesn't really consider most of my work writing in high level languages to be "coding". So maybe there's something here. But if I had to get into the underlying structure, I could. I do wonder whether the same can be said for people who just kludge together a bunch of APIs that produce magical result sets.

dotancohen4 months ago

> But if I had to get into the underlying structure, I could.

How do you propose to get into the underlying structure of the OpenAI API? Breach their network and steal their code and models? I don't understand what you're arguing.

latexr4 months ago

> How do you propose to get into the underlying structure of the OpenAI API?

The fact that you can't is the point of the comment. You could get into the underlying structure of other things, like the C source of a scripting language's interpreter.

robertlagrant4 months ago

But what about the microcode inside the CPU?

zja4 months ago

That tends to not be open source, and people don’t claim that it is.

K0balt4 months ago

I think the relevant analogy here would be to run a local model. There are several tools that make it easy to run local models behind a local API. I run a 70b finetune with some tool use locally on our farm, and it is accessible to all users as a local OpenAI alternative. For most applications it is adequate, and data stays on the campus area network.

noduerme4 months ago

A more accurate analogy would be, are you capable of finding and correcting errors in the model at the neural level if necessary? Do you have an accurate mental picture of how it performs its tasks, in a way that allows you to predictably control its output, if not actually modify it? If not, you're mostly smashing very expensive matchbox cars together, rather than doing anything resembling programming.

K0balt4 months ago

As an ancient embedded systems programmer, I feel your frustration… but I think that it's misguided. LLMs are not "computers". They are a statistics-driven tool for navigating human written (and graphical) culture.

It just so happens to be that a lot of useful stuff is in that box, and LLMs are handy at bringing it out in context. Getting them to “think” is tricky, and it’s best to remember that what you are really doing is trying to get them to talk as if they were thinking.

It sure as heck isn’t programming lol.

Also, it's useful to keep in mind that "hallucinations" are not malfunctions. If you were to change parameters to eliminate hallucinations, you would lose the majority of the unusual usefulness of the tool, its ability to synthesise and recombine ideas in statistically plausible (but otherwise random) ways. It's almost like imagination. People imagine goofy shit all the time too.

At any rate, using agentic scripting you can get it to follow a kind of plan, and it can get pretty close to an actual "train of thought" facsimile for some kinds of tasks.

There are some really solid use cases, actually, but I’d say mostly they aren’t the ones trying to get LLMs to replace higher level tasks. They are actually really good at doing rote menial things. The best LLMs apps are going to be the boring ones.

seadan834 months ago

I think the argument is that stitching things together at a high level is not really coding. A bit of a no true Scotsman perspective. The example is that anything more abstract than assembly is not even true coding, let alone creating a wrapper layer around an LLM.

guappa4 months ago

This stuff is starting to enter Debian as well -_-'

zlwaterfieldop4 months ago

The plan is to add local LLM support, so the goal is a fully OSS stack. Agree the initial wording could have been better.

slg4 months ago

I have been using LanguageTool[1] for years as "an open source alternative to [old school] Grammarly". It doesn't do that fancy "make this text more professional" AI stuff that this or Grammarly can now do, but they offer a self-hosted version so you don't need to send everything you write to OpenAI. If all you want is a better spelling/grammar checker, I highly recommend it.

[1] - https://github.com/languagetool-org/languagetool

dspillett4 months ago

You can also run your own local instance for the in-browser checking, which is handy for me as I need to be careful about sending text off to another company in another country (due to both client security requirements and personal paranoia!).

You don't get the AI-based extras like paraphrasing and the other bits listed as premium-only (https://languagetool.org/premium_new), but if you install the n-gram DB for your language (https://languagetool.org/download/ngram-data/), I found it at least as good as, and for some examples better than, Grammarly's free offering last time I did a comparison.

dspillett4 months ago

Replying to self as I'm too late to edit: I left the wrong link for ngram info, the download location instead of the instructions for use which are at https://dev.languagetool.org/finding-errors-using-n-gram-dat...

dontdieych4 months ago

I've downloaded 'ngrams-en-20150817'. Could you drop a link that explains how to apply the ngrams file?

Thanks.

dspillett4 months ago

I dropped the wrong link in the original post. The instructions for use are at https://dev.languagetool.org/finding-errors-using-n-gram-dat...

dontdieych4 months ago

Thank you! Got it.

weinzierl4 months ago

It's great. I had a Grammarly subscription for a couple of years and used both tools in parallel, but found myself using LanguageTool more and more. It is strictly better, I'd say even for English, but certainly if you need other languages or deal with multilingual documents. So I canceled Grammarly and haven't missed it since.

You can also self-host, and we do that at my workplace because we deal with sensitive documents.

dewey4 months ago

Same; it integrates into all input fields too and has all the browser extensions you need. Non-GitHub landing page: https://languagetool.org

lou13064 months ago

For VSCode users who want to try out LanguageTool, I cannot recommend the LTeX extension [1] highly enough. Setting up a self-hosted configuration is really easy and it integrates very neatly with the editor. It was originally built for LaTeX but also supports Markdown now.

[1]: https://github.com/valentjn/vscode-ltex

isaacfrond4 months ago

And you can write your own custom rules. It's great: as a reward for spotting an error in your writing, you get to write a tiny bit of code to spot it automatically next time. I've collected hundreds.

divan4 months ago

Is there a way to add and use niche custom terminology?

isaacfrond4 months ago

I've turned off the spell checker. Spell checking is done just fine in Word so I don't need it there.

herrherrmann4 months ago

You can add your own words to your account, if that’s what you mean!

shahzaibmushtaq4 months ago

How come I have never heard of LanguageTool before? Or maybe I have never looked beyond Grammarly. Thank you!

heinrichf4 months ago

There is also an alternative, more lightweight self-hosted server in Rust that is compatible with the official clients: https://github.com/cpg314/ltapiserv-rs

herrherrmann4 months ago

Absolutely plus one on this. LanguageTool is great and I'm also very happy on the free tier. With the app installed on macOS it also checks mail in the Apple Mail app, for example.

milansuk4 months ago

> It doesn't do that fancy "make this text more professional"

I looked into the Scramble code[0] and it seems there are a few pre-defined prompts (const DEFAULT_PROMPTS).

[0] https://github.com/zlwaterfield/scramble/blob/main/backgroun...
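
For a sense of what that looks like, a prompt table of that shape might resemble the sketch below; only the first prompt string is quoted from the repo (via another comment in this thread), and the rest are made-up stand-ins:

```javascript
// Illustrative sketch of a DEFAULT_PROMPTS-style table, not the repo's actual contents.
const DEFAULT_PROMPTS = {
  "Fix spelling": "Please correct spelling mistakes in the following text: ",
  "Fix grammar": "Please correct grammatical errors in the following text: ",
  "Make professional": "Please rewrite the following text in a more professional tone: ",
};

// The extension prepends the chosen prompt to the highlighted text before sending it to the model.
function buildMessage(promptName, highlightedText) {
  return DEFAULT_PROMPTS[promptName] + highlightedText;
}
```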

Semaphor4 months ago

This explains why I was confused by this. I moved to LT many, many years ago, and didn’t know about those new Grammarly features. So I really wasn’t clear how rewriting a specific text had anything to do with Grammarly.

ktosobcy4 months ago

This! And what's more, it doesn't funnel everything I type to OpenAI, so I'd say it's more FOSS than this extension…

dspillett4 months ago

And if you are in a regulatory environment (or elsewhere where data exfiltration paranoia is part of your daily work life), you can install your own instance of the service (sans premium features) and not send your text anywhere outside infrastructure you control.

zlwaterfieldop4 months ago

After years with Grammarly, I wanted a simpler, cheaper way to improve my writing. So I built Scramble, a Chrome extension that uses an LLM for writing enhancements.

Key features:

- Uses your OpenAI API key (100% local)
- Pre-defined prompts for various improvements
- Highlight text and wait for suggestions
- Currently fixed to GPT-4-turbo

Future plans: add LLM provider/model choice, custom prompts, bug fixes, and improve default prompts.

It's probably buggy, but I'll keep improving it. Feedback welcome.

GitHub: https://github.com/zlwaterfield/scramble

xdennis4 months ago

> Key features: - Uses your OpenAI API key (100% local)

Sorry, but we have a fundamental disagreement on terms here. Sending requests to OpenAI is not 100% local.

The OpenAI API is not free or open source. By your definition, if you used the Grammarly API for this extension it would be a 100% local, open source alternative to Grammarly too.

zlwaterfieldop4 months ago

Agree, I want to add a local LLM setup. The wording there isn't great.

kylebenzle4 months ago

Without marketing speak can I ask why anyone would have a need for a service like grammerly, I always thought it was odd trying to sell a subscription based spell checker (AI is just a REALLY good spell checker).

gazereth4 months ago

Non-native speakers find it useful since it doesn't just fix spelling but also fixes correctness, directness, tone and tense. It gives you an indication of how your writing comes across, e.g. friendly, aggressive, assertive, polite.

English can be a very nuanced language - easy to learn, difficult to master. Grammarly helps with that.

rlayton24 months ago

I'm a big fan of Grammarly and have been using it, and paying for it, for years.

The advantage is not spell checking. It is grammar and style improvements. It tells you things like "this language is informal", or "this is a better word for that".

mhuffman4 months ago

The "grammar" part, at least in a professional setting. You might be shocked at how many people will write an email pretty much like they would talk to friends at a club or send a text message (complete with emojis!) or just generally butcher professional correspondence.

dotancohen4 months ago

So it may be more attractive to employers to check their employees' output, rather than an individual checking his own?

oneeyedpigeon4 months ago

No, it's also useful to check your own writing. I've used it as both an Editor and a Writer.

socksy4 months ago

It is widely used in countries where the professional language is English, but the native language of the speakers is not.

For example, most Slavic languages don't have the same definite/indefinite article system English does, which means that whilst someone could speak and write excellent English, the correct usage of "a" and "the" is a constant conscious struggle, where having a tool to check and correct your working is really useful. In Greek, word order is not so important. And so on.

Spell check usually just doesn't cut it, and when it does (say, in Word), it usually isn't universally available.

Personally, I have long wanted such a system for German, which I am not native in. Lucky for me DeepL launched a similar product with German support.

A recent example for me was that I was universally using "bekommen" as a literal translation of "receive" in all sentences where I needed that word. Through DeepL I learned that the more appropriate word in a bunch of contexts is "erhalten", which is the sort of thing that I would never have got from a spell check.

Grammarly is notably a Ukrainian founded company.

pbhjpbhj4 months ago

Without marketing speak, can I ask why anyone would have a need for a service like Grammarly?

    ---
Manual corrections here, but maybe they give a clue?

robertlagrant4 months ago

They aren't a native English speaker and would like a hand with phrasing.

lhousa4 months ago

Rookie question: the OpenAI API endpoint costs extra, right? Not something that comes with ChatGPT or ChatGPT Plus.

zlwaterfieldop4 months ago

Correct but I'm going to loom into a locally running LLM so it would be free.

Tepix4 months ago

Please do (assuming you mean "look"). When you add support for a custom API URL, please make sure it supports HTTP Basic authentication.

That's super useful for people who run say ollama with an nginx reverse proxy in front of it (that adds authentication).

nickthegreek4 months ago

Please look into allowing it to connect to either an LM Studio endpoint or ollama.

paraknight4 months ago

Yes

Szpadel4 months ago

Yes, but gpt-4o-mini costs very little, so you'll probably spend well under $1/month.
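
A rough back-of-the-envelope check, assuming gpt-4o-mini's listed prices at the time (about $0.15 per million input tokens and $0.60 per million output tokens) and fairly heavy casual use:

```javascript
// Back-of-envelope cost estimate; prices and usage numbers are assumptions.
const inputPricePerToken = 0.15 / 1_000_000;   // ~$0.15 per 1M input tokens
const outputPricePerToken = 0.60 / 1_000_000;  // ~$0.60 per 1M output tokens

const correctionsPerDay = 50;     // emails, comments, docs
const tokensPerCorrection = 500;  // text in plus corrected text out, roughly split evenly

const dailyTokens = correctionsPerDay * tokensPerCorrection;
const dailyCost =
  (dailyTokens / 2) * inputPricePerToken + (dailyTokens / 2) * outputPricePerToken;

console.log(`~$${(dailyCost * 30).toFixed(2)} per month`); // ≈ $0.28/month under these assumptions
```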

miguelaeh4 months ago

I don't think the point here should be the cost, but the fact that you are sending everything you write to OpenAI to train their models on your information. The option of a local model allows you to preserve the privacy of what you write. I like that.

nickthegreek4 months ago

OpenAI does not train models on data that comes in through the API.

https://openai.com/policies/business-terms/

punchmesan4 months ago

Assuming for the moment that they aren't saying that with their fingers crossed behind their back, that doesn't change the fact that they store the inputs they receive and swear they'll protect it (Paraphrasing from the Content section of the above link). Even if it's not fed back into the LLM, the fact that they store the inputs anywhere for a period of time is a huge privacy risk -- after all a breach is a matter of "when", not "if".

TheRealPomax4 months ago

Does it work in "not a browser" though? Because that's the last place I need this; I really want this in Typora, VS Code, etc. instead.

zlwaterfieldop4 months ago

Not right now. Looking into a mac app. This was just a quick and dirty first go at it.

TheRealPomax4 months ago

Makes sense. Strongly hope it won't be a "Mac app" but a cross-platform application instead, though: nothing worse than having a great Mac app that you can't use 50% of the time because your work computer's a Mac and your personal computer's a Windows machine because you like playing games.

compootr4 months ago

how much does it cost in a normal day?

Tepix4 months ago

Don't think about money. Think about the cost in terms of forgone privacy.

compootr4 months ago

To protect your privacy from Grammarly, you fork over your data to OpenAI?

Tepix4 months ago

Hopefully we'll get local LLM support soon.

pkhamre4 months ago

What is a normal day?

compootr4 months ago

like what he's spending on average.

Maybe sending some emails, writing or proofreading some docs -- what you'd do in a business day

exe344 months ago

a day when nothing too unusual happens.

_HMCB_4 months ago

This is awesome. Can’t wait to install it and put it through its paces.

remoquete4 months ago

In the same space, I recommend checking out the Vale linter. Fairly powerful and open source, too. And doesn't rely on a backend.

https://vale.sh

loughnane4 months ago

I love Vale. I've been using it for years. I branched rules from someone trying to emulate The Economist style guide and kept tweaking.

I like this approach so much better than leaning on AI because it’s more my “voice”.

https://github.com/loughnane/style

aDyslecticCrow4 months ago

Grammarly is a lifesaver for my day-to-day writing. All it does is correct spelling and punctuation or give rephrase suggestions. But Grammarly does it so unreasonably well that nothing else compares.

Grammarly's core functionality is not even LLM-based; it's older than that. Recently, they've crammed in some LLM features that I don't care a snoot about compared to its core functionality.

This tool, like any other "Grammarly alternative," is just another GPT wrapper to rewrite my text in an overly verbose and soulless way. I was hoping for a halfway-decent spelling corrector.

funshed4 months ago

Absolutely! Being dyslexic, Grammarly is much more than the AI tool that was recently added, which is great, too.

ully3 months ago

I installed the extension on Vivaldi and added an OpenAI API key, which registered as "saved". But when I click on the extension it still says "API key not set. Please set it in the options."

Can anyone help

vunderba4 months ago

Nice job—I'm always a fan of 'bring your own key' (BYOK) approaches. I think there's a lot of potential in using LLMs as virtual copy editors.

I do a fair amount of writing and have actually put together several custom GPTs, each with varying degrees of freedom to rewrite the text.

The first one acts strictly as a professional editor—it's allowed to fix spelling errors, grammatical issues, word repetition, etc., but it has to preserve the original writing style.

I do a lot of dictation while I walk my husky, so when I get back home, I can run whisper, convert the audio to text, and throw it at the GPT. It cleans it up, structures it into paragraphs, etc. Between whisper/GPT, it saves me hours of busy work.

The other one is allowed to restructure the text, fix continuity errors, replace words to ensure a more professional tone, and improve the overall flow. This one is reserved more for public communiqués such as business-related emails.
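
For anyone curious, that dictation pipeline can be a couple of API calls. A minimal sketch using OpenAI's Whisper transcription and chat-completions endpoints, assuming Node 18+ (the file name, model choices, and prompt wording are placeholders, not the commenter's actual setup):

```javascript
import fs from "node:fs/promises";

// Sketch: transcribe a voice memo, then have a chat model clean it up without rewriting it.
async function dictationToDraft(audioPath, apiKey) {
  // 1. Speech-to-text with Whisper.
  const form = new FormData();
  form.append("file", new Blob([await fs.readFile(audioPath)]), "memo.m4a");
  form.append("model", "whisper-1");
  const transcript = await fetch("https://api.openai.com/v1/audio/transcriptions", {
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}` },
    body: form,
  }).then((r) => r.json());

  // 2. Light copy-editing pass that preserves the original voice.
  const edited = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "gpt-4-turbo",
      messages: [
        { role: "system", content: "Fix spelling, grammar, and paragraph breaks. Preserve the author's style." },
        { role: "user", content: transcript.text },
      ],
    }),
  }).then((r) => r.json());

  return edited.choices[0].message.content;
}
```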

edweis4 months ago

> I'm always a fan of 'bring your own key' (BYOK) approaches.

"Bring your own key" has the same amount of syllables as "BYOK"

closetkantian4 months ago

If your point is that BYOK is a useless acronym since it has the same number* of syllables, I disagree. Acronyms aren't just for reducing syllable count; they also reduce visual clutter and are easier to read for people who scan text.

pixelpoet4 months ago

My brother from another mother, I thought I was the only one left who distinguishes much from many. (I wish I didn't know that it's technically an initialism not an acronym...)

closetkantian4 months ago

Hahaha, this comment has me thinking about how I would pronounce it. Bee-yok? Bye-yolk?

copperx4 months ago

I do something similar. I have a custom Gemini Gem that critiques my writing and points out how I can better my paragraphs, but I do the bulk of the rewriting myself.

I'm not a native speaker, and the nice thing about this approach is that I seem to be learning to write better instead of just delegating the task to the machine.

thankyoufriend4 months ago

Very cool! I'd be interested in reading more about your dictation-to-text process if you documented it somewhere, thanks.

My partner and I were just talking about how useful that would be, especially driving in the car when all of the "we should..." thoughts come out of hiding. Capturing those action items more organically without destroying the flow of the conversation would be heavenly.

raverbashing4 months ago

> open-source Chrome extension

> It's designed to be a more customizable and privacy-respecting alternative to Grammarly.

> This extension requires an OpenAI API key to function

I disagree with this description of the service

No, it's not an "open-source alternative to Grammarly"; it's an OpenAI wrapper.

8f2ab37a-ed6c4 months ago

Wonder if there's an option to somehow pipe the prompting to a local ollama instead.

raverbashing4 months ago

That would be an interesting possibility

zlwaterfieldop4 months ago

Agree, wording could be improved. I'm gonna add local LLM support.

chilipepperhott4 months ago

While Scramble doesn't seem to respect your privacy, a project I've been working on does.

Meet Harper https://github.com/elijah-potter/harper

singhrac4 months ago

I think Harper is very cool, and you should sell it better. It's a local-only low latency & static (no Python) LanguageTool alternative. It doesn't use a large language model.

polemic4 months ago

Seems a stretch to call it open source.

WA4 months ago

Seems a stretch to call it "more privacy-friendly" if it talks to OpenAI.

insane_dreamer4 months ago

Disagree. The fact that it can call another closed-source service doesn't mean that this tool itself is not open source.

senko4 months ago

The source seems to be at the linked repo, and the license is MIT. How’s that a stretch?

trog4 months ago

> The source seems to be at the linked repo, and the license is MIT. How’s that a stretch?

Speaking for myself, I clicked on this thinking it might be open source in the sense of something I can run fully locally, like with a small grammar-only model.

n_plus_1_acc4 months ago

Check out LanguageTool, as mentioned in other comments. It is truly open source.

latexr4 months ago

Because it’s a wrapper on a closed-source system.

Imagine writing a shell script that cuts and converts video by calling ffmpeg. Would you say it was "a video converter written in bash"? No: the important part would not be in bash; that's just the thin wrapper used to call the tool, and it could be in any language. Meaning it would be useless to anyone who e.g. worked on a constrained system where they are not allowed to install any binaries.

Same thing here. If you only run open-source software for privacy reasons, sending all your program data to some closed server you don’t control doesn’t address your issue. There’s no meaningful difference between making an open-source plugin that calls an OpenAI API and one that calls a Grammarly API.

guappa4 months ago

I've seen posts like "JS interpreter written in 1 line" that were just a script calling node…

latexr4 months ago

Were those being serious? That sounds like it could’ve been a joke/commentary.

Then again, there are people who genuinely believe they could trivially rewrite curl.

https://daniel.haxx.se/blog/2021/05/20/i-could-rewrite-curl/

guappa4 months ago

Yes I think they were serious, and they used eval() or whatever.

TheDong4 months ago

Code is only copyrightable if it has any element of creativity.

This repo is _only_ really 7 sentences, like "Please correct spelling mistakes in the following text: " (these https://github.com/zlwaterfield/scramble/blob/2c1d9ebbd6b935...)

Everything else is uncreative, and possibly un-copyrightable, boilerplate to send those sentences to OpenAI.

All of the creative software happens on OpenAI's servers using proprietary code.

too_damn_fast4 months ago

Why would you even say 'please' in a prompt?

t-writescode4 months ago

There has been evidence that better responses are sometimes provided with politeness for some LLMs.

And some people just try to be polite and it only costs a couple tokens.

chaosist4 months ago

I used to say please/thank you to GPT-4 in 2023 all the time, but it was because I was completely anthropomorphizing the model in various ways.

I suspect it would be just as easy to write a paper showing that saying please has absolutely no effect on the output. I feel like GPT-4 is/was stochastically better on some days and at some hours than others. That might be wrong too, though. The idea that it is provable that "please" has a positive effect on the output is most likely ridiculous.

dotancohen4 months ago

The MIT licensed code is a wrapper for the OpenAI API. That OpenAI API provides the core functionality, and it is not open source.

xdennis4 months ago

The entire codebase is one call to `api.openai.com`.

If I sold you an electrical generator, but the way it worked was by plugging it in, would you say it's fair to call it a generator?

nucleartux4 months ago

I made the same thing, but it works without a ChatGPT key: https://github.com/nucleartux/ai-grammar/

creesch4 months ago

That looks pretty neat. How well does the Gemini Nano model work for this? Is it just picking up spelling errors, or is it also looking at things like punctuation?

nucleartux4 months ago

It actually works pretty well. It fixes all grammar mistakes and punctuation and changes words if they don’t fit. The only downside is that, because it’s a very small model, it sometimes produces completely nonsensical or incomplete responses. I haven’t figured out how to fix this yet.

You can have a look at the screenshots in the repository or on the store page.

Tepix4 months ago

Nice. Can you please add support for contacting your own private OpenAI compatible server (like ollama)?

nucleartux4 months ago

Yes, it's on my roadmap!

rafram4 months ago

Grammarly's grammar checking predates modern LLMs by many years, so I assume they're actually using some kind of rule-based engine internally.

keiran_cull4 months ago

From what I understand, they've used a whole bunch of different kinds of AI models over the years.

They've been reasonably transparent about how things work, e.g. this blog post from 2018: https://www.grammarly.com/blog/transforming-writing-style-wi...

tiew9Vii4 months ago

I was a big fan of Grammarly. As a dyslexic, I often write the wrong word, then ten minutes later when re-reading spot that I used the wrong word/spelling, etc.

It worked extremely well, as you say, I think by using basic rule engines.

I've canceled my subscription recently as I found it getting worse, not better, I suspect because they are now applying LLMs.

The suggestions started to make less sense, and the problem with LLM suggestions is that all your writing takes on the tone of the LLM; you lose your personality/style in what you write.

The basic rules approach worked much better for me.

TheRealPomax4 months ago

This pretends that LLMs aren't just "more machine learning", which they simply are.

conradklnspl4 months ago

How does this compare to https://languagetool.org, which is also open source?

I'm not sure what kind of AI LanguageTool uses, but it works really well!

patrakov4 months ago

LanguageTool is not open source; it is open core. There are proprietary "premium rules," and you won't get them in a self-hosted version.

dns_snek4 months ago

Self hosted & open core seems distinctly better than an open wrapper around a black box core that's hosted by a 3rd party.

conradklnspl4 months ago

They use the LGPL license for a lot of their work.

https://github.com/languagetool-org/languagetool/blob/master...

bartread4 months ago

> It's designed to be a more customizable and privacy-respecting alternative to Grammarly.

Kind of a shame it says it’s specifically for Chrome then. Where’s the love for Firefox?

daef4 months ago

upping this - I won't install chrome :)

halJordan4 months ago

Seems like it just has some prebaked prompts right now. FF's AI integration does this much already with custom prompts and custom providers. Please let me set my own base URL. So many tools already support the OpenAI API.

All of that to say, this is of course a great addition to the ecosystem.

ichik4 months ago

For me a huge part of Grammarly's magic is that it's not just in the browser but in any text input on the desktop via their desktop app (with some exceptions). Having it in only one application just doesn't cut it, especially since it's not my browser of choice. Are there any plans regarding desktop integration? Linux is woefully underserved in this space, with all major offerings (Grammarly, LanguageTool) having only macOS/Windows versions.

bukacdan4 months ago

I have developed a system-wide writing assistant like you're describing. By design, it has no exceptions to where it works.

Currently, it's only for Mac, but I'm working on an Electron version too (though it's quite challenging).

Check out https://steerapp.ai/

ichik4 months ago

Is the Electron version supposed to be available on Linux? I see only mentions of Windows on the website.

grayxu4 months ago

One strong point of Grammarly comes from its friendly display of diffs (which is somewhat similar to what Cursor does). This project simply uses some predefined prompts to generate text and then replaces it. There are countless plugins that can achieve this, such as the OpenAI translator.

If this tool really wants to compete with Grammarly, it needs that kind of diff display.

miguelaeh4 months ago

I am a Grammarly user and I just installed Scramble to try it out. However, it does not seem to work. When I click on any of the options, nothing happens. I use Ubuntu 22.04.

Also, to provide some feedback, it would be awesome to make it automatically appear on text areas and highlight errors like Grammarly does; it creates a much better UX.

zlwaterfieldop4 months ago

Agree - I want to improve the UX; this was just a quick attempt at it. Thanks for the feedback!

miguelaeh4 months ago

You're welcome! Let me know if you plan to integrate local models as mentioned in other comments; I am working on something to make that transparent.

shahzaibmushtaq4 months ago

Grammarly was here before the AI boom, so Grammarly isn't just dependent on AI, but also heavily on HI.

gaiagraphia4 months ago

>Important: This extension requires an OpenAI API key to function. You need to provide your own API key in the extension settings. Please visit OpenAI to obtain an API key.

Obviously not important enough to put in the title, or a submission statement here, though. Curious.

zlwaterfieldop4 months ago

Honestly just an oversight. I want to remove that dependency anyway with an open-source model.

mobscenez4 months ago

That's awesome. Grammarly is good, but not as good as large language models such as GPT-4. I have been waiting a long time for a tool that incorporates LLMs into grammar checking, and here it comes! Hope it can integrate the Anthropic API in the near future.

isaacfrond4 months ago

Nowadays I just load the whole thing into ChatGPT and it checks it better than I ever could. You've got to be clear about what you want done in the prompt: don't change my writing, only correct errors.

lvl1554 months ago

I am building something similar to Grammarly as a personal project but quickly realized how hard it is to get data in 2024. Contemplating whether I should just resort to pirated data, which is just sad.

highcountess4 months ago

I'm just going to remind everyone that all these LLMs were also trained on not just pirated but outright stolen data, in organized and well-resourced assaults on proprietary information/data, not to mention riding roughshod over any and all licenses.

closetkantian4 months ago

To be fair, OpenAI used pirated data

ofou4 months ago

Loved it. I'd love to use something like "right-click, fix grammar" under iOS—not just rewrite. I want to keep my own voice, just with minimal conformant grammar as a second-language speaker.

rafram4 months ago

AFAIK Apple Intelligence will include essentially that.

ofou4 months ago

let's hope the rumors are real

0374 months ago

An alternative from the developer of Coolify. It’s no longer for sale, but the page mentions he’ll open-source it:

https://safetyper.com/

ziddoap4 months ago

Privacy.md needs to be updated.

>If you have any questions about this privacy policy, please contact us at [your contact information].

HL33tibCe74 months ago

This is exactly as open source as a Chrome extension wrapping Grammarly’s API would be, i.e. not at all.

janandonly4 months ago

I am currently paying for LanguageTool but I will definitely give this open-source software a try!

reify4 months ago

I also use LanguageTool

easy to install in LibreOffice

nik7364 months ago

How is it more privacy respecting when it's sending stuff to OpenAI servers?

reynaldi4 months ago

Awesome, I was just about to look for something like this and it showed up on HN!

the_arun4 months ago

Do we need OpenAI for this? Can't we have an LLM sitting locally do the work?

kirso4 months ago

Is there something like that for VSCode?

Festro4 months ago

So it doesn't provide real-time feedback on your writing within a dialog box like Grammarly does? It's just a set of pre-written prompts sent to (non-open-source) OpenAI?

Come on.

Pitch this honestly. It'll save me clicks if I'm already using an LLM to check grammar, but if I use Grammarly it's not an alternative at all. Not by a long way.

lccerina4 months ago

It uses OpenAI, so it's not open source. Keep this shit away from me.

mproud4 months ago

F*ck Grammarly.
